A Flutter plugin for integrating Liquid AI's LEAP SDK, enabling on-device deployment of small language models in Flutter applications.
## Platform Support

| Platform | Text Models | Vision Models | Notes |
|---|---|---|---|
| Android | ✅ Fully Supported | ✅ Fully Supported | API 31+, arm64-v8a |
| iOS | ✅ Fully Supported | ✅ Fully Supported | iOS 15+, 64-bit architecture |
## Features

- ✅ Model Management: Download, load, unload, and delete models
- ✅ Progress Tracking: Real-time download progress with throttling
- ✅ Text Generation: Both blocking and streaming responses
- ✅ Conversation Support: Persistent conversation history and context
- ✅ Function Calling: Register and execute custom functions (experimental)
- ✅ Error Handling: Comprehensive exception system with detailed error codes
- ✅ Memory Management: Efficient model lifecycle with cleanup
- ✅ Built on Official LEAP SDK: Uses Liquid AI's native SDK (v0.5.0)
- ✅ Vision Models Support: Process images with LFM2-VL models
- ✅ Secure Logging: Production-safe logging system with sensitive data protection
## Requirements

- Flutter SDK
- Android: Device with `arm64-v8a` ABI, minimum API level 31
- iOS: Device with iOS 15+, 64-bit architecture (iPhone 6s and newer)
- 3GB+ RAM recommended for model execution
## Installation

Add this to your `pubspec.yaml`:

```yaml
dependencies:
  flutter_leap_sdk: ^0.2.4
```

## Quick Start

```dart
import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

// Download a model (using display name for convenience)
await FlutterLeapSdkService.downloadModel(
  modelName: 'LFM2-350M', // Display name or full filename
  onProgress: (progress) => print('Download: ${progress.percentage}%'),
);

// Load the model (supports display names)
await FlutterLeapSdkService.loadModel(
  modelPath: 'LFM2-350M', // Will resolve to full filename automatically
);

// Generate response
String response = await FlutterLeapSdkService.generateResponse(
  'Hello, AI!',
  systemPrompt: 'You are a helpful assistant.',
);
print(response);

// Or use streaming for real-time responses
FlutterLeapSdkService.generateResponseStream('Hello, AI!').listen(
  (chunk) => print('Chunk: $chunk'),
  onDone: () => print('Generation complete'),
  onError: (error) => print('Error: $error'),
);
```

## Available Models

All models are downloaded from Hugging Face and cached locally:
| Model | Size | Description |
|---|---|---|
| LFM2-350M | 322 MB | Smallest model |
| LFM2-700M | 610 MB | Balanced model |
| LFM2-1.2B | 924 MB | Largest model |
| LFM2-VL-450M | 385 MB | Small vision model |
| LFM2-VL-1.6B | 1.19 GB | Vision model |
Note: Models are automatically downloaded to the app's documents directory under the `/leap/` folder.
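Because models stay cached on disk, you may eventually want to reclaim space. The sketch below uses the model-management methods from the API reference; it assumes `getDownloadedModels()` returns the cached bundle file names in the form that `deleteModel()` accepts, which is worth verifying against your installed version:

```dart
import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

/// Delete every cached model except the one named by [keep].
/// Assumes getDownloadedModels() returns file names usable with deleteModel().
Future<void> pruneModels({required String keep}) async {
  final List<String> models = await FlutterLeapSdkService.getDownloadedModels();
  for (final fileName in models) {
    if (fileName != keep) {
      await FlutterLeapSdkService.deleteModel(fileName);
    }
  }
}
```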
## Custom Models

Here's how to download and load a custom model programmatically:

```dart
// Download a custom model from a URL
await FlutterLeapSdkService.downloadModel(
  modelUrl: 'https://example.com/model.bundle',
  modelName: 'my-custom-model',
);

// Load the custom model
await FlutterLeapSdkService.loadModel(modelPath: 'my-custom-model');
```

## Complete Example

```dart
import 'package:flutter/material.dart';
import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

class ChatScreen extends StatefulWidget {
  @override
  _ChatScreenState createState() => _ChatScreenState();
}

class _ChatScreenState extends State<ChatScreen> {
  bool isModelLoaded = false;
  String response = '';

  @override
  void initState() {
    super.initState();
    _initializeModel();
  }

  Future<void> _initializeModel() async {
    try {
      // Check if model exists, download if not
      bool exists = await FlutterLeapSdkService.checkModelExists(
        'LFM2-350M-8da4w_output_8da8w-seq_4096.bundle',
      );
      if (!exists) {
        await FlutterLeapSdkService.downloadModel(
          modelName: 'LFM2-350M', // Using display name for convenience
          onProgress: (progress) {
            print('Download progress: ${progress.percentage}%');
            // Progress includes: bytesDownloaded, totalBytes, percentage
          },
        );
      }

      // Load the model with options
      await FlutterLeapSdkService.loadModel(
        modelPath: 'LFM2-350M',
        options: ModelLoadingOptions(
          randomSeed: 42,
          cpuThreads: 4,
        ),
      );

      setState(() {
        isModelLoaded = true;
      });
    } catch (e) {
      print('Error initializing model: $e');
    }
  }

  Future<void> _generateResponse(String message) async {
    if (!isModelLoaded) return;
    setState(() {
      response = '';
    });
    try {
      // Use streaming for real-time response
      FlutterLeapSdkService.generateResponseStream(message).listen(
        (chunk) {
          setState(() {
            response += chunk;
          });
        },
      );
    } catch (e) {
      print('Error generating response: $e');
    }
  }

  @override
  Widget build(BuildContext context) {
    // Minimal UI: display the streamed response as it arrives
    return Scaffold(body: Center(child: Text(response)));
  }
}
```

## Conversations

```dart
// Create a persistent conversation
Conversation conversation = await FlutterLeapSdkService.createConversation(
  systemPrompt: 'You are a helpful coding assistant.',
  generationOptions: GenerationOptions(
    temperature: 0.7,
    maxTokens: 1000,
  ),
);

// Generate responses within conversation context
String response = await conversation.generateResponse('Explain async/await in Dart');

// Use streaming with conversation
conversation.generateResponseStream('What are futures?').listen(
  (chunk) => print(chunk),
);

// Conversation automatically maintains history
print('History: ${conversation.history.length} messages');
```

## Vision Models

Work with images using LFM2-VL vision models:
```dart
import 'dart:io';
import 'dart:typed_data';

import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

// Load a vision model
await FlutterLeapSdkService.loadModel(
  modelPath: 'LFM2-VL-1.6B (Vision)',
);

// Create a conversation for vision tasks
Conversation visionChat = await FlutterLeapSdkService.createConversation(
  systemPrompt: 'You are a helpful AI that can see and analyze images.',
);

// Read the image bytes and analyze the image
File imageFile = File('/path/to/image.jpg');
Uint8List imageBytes = await imageFile.readAsBytes();

String response = await visionChat.generateResponseWithImage(
  'What do you see in this image?',
  imageBytes,
);
print('Vision response: $response');
```

## Error Handling

```dart
try {
  await FlutterLeapSdkService.loadModel(modelPath: 'nonexistent-model');
} on ModelLoadingException catch (e) {
  print('Failed to load model: ${e.message} (${e.code})');
} on ModelNotLoadedException catch (e) {
  print('Model not loaded: ${e.message}');
} on FlutterLeapSdkException catch (e) {
  print('SDK error: ${e.message} (${e.code})');
}
```

## API Reference

### Model Management

- `loadModel({String? modelPath, ModelLoadingOptions? options})` - Load a model with options
- `unloadModel()` - Unload the current model and free memory
- `checkModelLoaded()` - Check whether a model is loaded
- `checkModelExists(String modelName)` - Check whether a model file exists
- `getDownloadedModels()` - List all local models
- `deleteModel(String fileName)` - Delete a model file
- `getModelInfo(String fileName)` - Get model metadata
### Generation

- `generateResponse(String message, {String? systemPrompt, GenerationOptions? options})` - Generate a complete response
- `generateResponseStream(String message, {String? systemPrompt, GenerationOptions? options})` - Streaming generation
- `generateResponseWithImage(String message, Uint8List imageBytes, {String? systemPrompt, GenerationOptions? options})` - Generate a response with an image
- `cancelStreaming()` - Cancel active streaming
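Streaming can be interrupted mid-generation, for example when the user taps a stop button. The sketch below assumes `cancelStreaming()` stops the native generation so the stream ends; the `StreamSubscription` handling is standard Dart:

```dart
import 'dart:async';

import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

StreamSubscription<String>? _generation;

void startGeneration(String prompt) {
  _generation = FlutterLeapSdkService.generateResponseStream(prompt).listen(
    (chunk) => print(chunk),
    onDone: () => print('Generation finished or cancelled'),
  );
}

/// Called from a "stop" button: cancel native generation,
/// then tear down the Dart-side subscription.
Future<void> stopGeneration() async {
  await FlutterLeapSdkService.cancelStreaming();
  await _generation?.cancel();
  _generation = null;
}
```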
### Conversations

- `createConversation({String? systemPrompt, GenerationOptions? options})` - Create a conversation
- `getConversation(String id)` - Get an existing conversation
- `disposeConversation(String id)` - Clean up conversation resources
### Download Management

- `downloadModel({String? modelUrl, String? modelName, Function(DownloadProgress)? onProgress})` - Download with progress tracking
- `cancelDownload(String downloadId)` - Cancel an ongoing download
- `getActiveDownloads()` - List active download IDs
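Since `downloadModel` awaits completion, cancellation has to come from outside the awaiting call. A minimal sketch, assuming `getActiveDownloads()` returns the same IDs that `cancelDownload()` expects:

```dart
import 'package:flutter_leap_sdk/flutter_leap_sdk.dart';

/// Cancel every in-flight model download, e.g. when the user
/// navigates away from the download screen.
Future<void> cancelAllDownloads() async {
  final List<String> ids = await FlutterLeapSdkService.getActiveDownloads();
  for (final id in ids) {
    await FlutterLeapSdkService.cancelDownload(id);
  }
}
```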
This package is built on top of Liquid AI's official LEAP SDK. For more information about LEAP SDK and Liquid AI, visit leap.liquid.ai.
Contributions are welcome! Please feel free to submit Pull Requests or file issues.
This project is licensed under the MIT License - see the LICENSE file for details.