A Flutter-based voice assistant application built with the Vapi SDK for seamless voice interactions, plus a raw WebSocket implementation for custom solutions. This project demonstrates a voice-enabled personal assistant with real-time communication capabilities.
Demo: iPhone 13 Simulator screen recordings (16 June 2025).
- 🗣️ Voice Interaction: Natural voice communication with AI assistant
- 🔊 Real-time Audio Processing:
  - High-quality audio streaming
  - Voice activity detection
  - Background noise handling
- 🛠️ Core Capabilities:
  - Speech-to-Text (STT) conversion
  - Text-to-Speech (TTS) synthesis
  - Real-time WebSocket communication
- 📱 Mobile Integration:
  - Native Flutter implementation
  - iOS and Android support
  - Permission handling
  - Audio device management
- Flutter SDK ≥ 3.0.0
- Android:
  - compileSdkVersion ≥ 33
  - minSdkVersion ≥ 24
  - NDK ≥ 25.1.8937393
- Vapi API credentials
Clone and set up the project:

```bash
git clone
cd vapi-personal-assistant

# Install dependencies
flutter pub get
```
Create a `credentials.dart` file in the `lib` directory:

```dart
const VAPI_API_KEY = 'Your VAPI API key';
const VAPI_ASSISTANT_ID = 'Your VAPI assistant ID';
const VAPI_PUBLIC_KEY = 'Your VAPI public key';
```
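These constants can then be imported wherever the client is created. The snippet below is only an illustrative sketch of that wiring; the actual constructor and method names are defined by the project's own client in `lib/vapi/Vapi.dart` and may differ.

```dart
import 'credentials.dart';
import 'vapi/Vapi.dart';

// Illustrative wiring only: the constructor and method names here are
// assumptions, not the exact API exposed by lib/vapi/Vapi.dart.
final vapi = Vapi(VAPI_PUBLIC_KEY);

Future<void> startAssistantCall() async {
  // Start a session against the assistant configured in the Vapi dashboard.
  await vapi.start(assistantId: VAPI_ASSISTANT_ID);
}
```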
Add to your `Info.plist`:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app requires access to the microphone for voice communication.</string>
```
Update your `Podfile`:

```ruby
post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',
        'PERMISSION_MICROPHONE=1',
      ]
    end
  end
end
```
Add to your `AndroidManifest.xml`:

```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```
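Declaring the permission in the manifest is not enough on Android 6.0+ (or on iOS); microphone access also has to be requested at runtime. A minimal sketch, assuming the `permission_handler` package (which is what the `PERMISSION_MICROPHONE=1` flag in the Podfile above configures):

```dart
import 'package:permission_handler/permission_handler.dart';

/// Asks the user for microphone access before starting a voice session.
/// Returns true only when the permission has been granted.
Future<bool> ensureMicrophonePermission() async {
  final status = await Permission.microphone.request();
  return status.isGranted;
}
```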
Run the app:

```bash
flutter run
```
```
lib/
├── main.dart              # Application entry point
├── credentials.dart       # API credentials
├── vapi/                  # Vapi implementation
│   ├── Vapi.dart          # Main Vapi client
│   └── vapi_event.dart    # Event handling
└── screens/               # UI components
    ├── vapi_screen.dart
    └── vapi_custom_screen.dart

backend/
├── index.js               # Express server setup
├── routes/                # API endpoints
├── models/                # Database models
│   ├── Todo.js
│   ├── reminder.js
│   └── calendar.js
├── config/                # Configuration files
└── utils/                 # Helper functions
```
- Vapi Client: Handles communication with Vapi services
- Audio Management:
  - Recording with `AudioRecorder`
  - Playback with `FlutterSoundPlayer`
- WebSocket Communication: Real-time bidirectional data transfer (see the sketch below)
- Permission Handling: Microphone and audio settings management
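For the custom (non-SDK) path, events travel over a raw WebSocket. A minimal sketch, assuming the `web_socket_channel` package; the endpoint URL and message payload below are placeholders, not the actual Vapi protocol:

```dart
import 'dart:convert';

import 'package:web_socket_channel/web_socket_channel.dart';

void connectRealtime() {
  // Placeholder endpoint: replace with the realtime URL the app actually uses.
  final channel = WebSocketChannel.connect(
    Uri.parse('wss://example.com/realtime'),
  );

  // Incoming events (transcripts, assistant replies, audio chunks, ...).
  channel.stream.listen(
    (message) => print('event: $message'),
    onError: (error) => print('socket error: $error'),
    onDone: () => print('socket closed'),
  );

  // Outgoing messages go through the sink.
  channel.sink.add(jsonEncode({'type': 'ping'}));
}
```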
The project includes a Node.js backend with Express that provides additional functionality through tool calling:
- SQLite database using Sequelize ORM
- Models for todos, reminders, and calendar events
- Automatic table creation and synchronization
The backend implements several tools that can be called by the voice assistant:
Todos:
- `createTodo`: Create new todo items
- `getTodos`: Retrieve all todos
- `deleteTodo`: Remove specific todos

Reminders:
- `addReminder`: Set new reminders with importance levels
- `getReminders`: List all reminders
- `deleteReminders`: Remove specific reminders

Calendar:
- `createEvent`: Schedule new calendar events
- `getEvents`: Retrieve all calendar events
- `deleteEvent`: Remove specific events
- RESTful endpoints with VAPI request validation
- Structured response format for tool calls
- Error handling and logging
- Request validation middleware
To run the backend:

```bash
cd backend
npm install
node index.js
```
The server will start on http://localhost:3000 and automatically set up the database.

NOTE: If you are running the backend on localhost, you will likely need to expose it with ngrok (e.g. `ngrok http 3000`) so that Vapi tool calling can reach it.
The application provides two main interfaces:
- Web Call Screen: Standard voice interaction interface
- Support Call Screen: Customized voice interaction experience
This project is licensed under the MIT License - see the LICENSE.txt file for details.
Created by Oleksandr Samoilenko
Extrawest.com, 2025