A standalone browser extension for intelligent data extraction and form filling using AI technology. Now with Ollama local model support!
🌐 Live Demo & Documentation | 📦 Download Latest Release | 📖 View Documentation | 🎥 Watch Demo Video
Smart Form Filler analyzing and auto-filling a complex web form
💡 Tip: The demo showcases all major features, including AI-powered form filling and data extraction.
- Data Extraction: Extract structured data from web pages
- Smart Form Filling: AI-powered automatic form completion with intelligent field mapping
- Multi-format Output: Raw HTML, cleaned HTML, and Markdown formats
- Chat with Data: Interactive Q&A with extracted content
- Local AI Models: Full Ollama integration for privacy-focused AI
- Cloud AI Models: Support for GPT-4o, o series, DeepSeek, and other cloud providers
- Backend Configuration: Built-in settings interface for configuring backend connections
- Service Status Monitoring: Real-time backend connection status and error handling
- Intelligent Field Analysis: Enhanced field descriptions including available options for dropdowns, radio buttons, and checkboxes
- Browser Integration: Works seamlessly with Chrome and other Chromium-based browsers
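To make the "Intelligent Field Analysis" feature concrete, here is a minimal sketch of how a harvested form field could be turned into an option-aware description for the AI. The `formatFieldDescription` helper and the field object shape are illustrative assumptions, not the extension's actual API:

```javascript
// Hypothetical sketch: turn a harvested form field into a description an
// AI model can use for mapping. Names here are illustrative, not the real API.
function formatFieldDescription(field) {
  let description = `${field.label || field.name} (${field.type})`;
  if (Array.isArray(field.options) && field.options.length > 0) {
    // Include the allowed values for dropdowns, radios, and checkboxes
    description += ` [options: ${field.options.join(", ")}]`;
  }
  return description;
}

// Example: a <select> element harvested from the page
const rating = {
  name: "rating",
  label: "Overall rating",
  type: "select",
  options: ["1", "2", "3", "4", "5"],
};
console.log(formatFieldDescription(rating));
// "Overall rating (select) [options: 1, 2, 3, 4, 5]"
```

Passing the allowed options along with the field lets the model pick a valid value instead of free text.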
smart-form-filler/
├── backend/              # Backend API server
│   ├── controllers/      # API controllers
│   ├── services/         # Business logic services
│   ├── routes/           # API routes
│   ├── config/           # Configuration files
│   └── server.js         # Main server file
├── extension/            # Browser extension
│   ├── src/              # Extension source code
│   ├── manifest.json     # Extension manifest
│   └── popup.html        # Extension popup UI
└── package.json          # Root package configuration
The Smart Form Filler extension has been refactored into a clean, modular architecture to improve maintainability and code organization. Each file is kept small and focused so it stays easy to navigate and maintain.
🎯 Main Entry Point:
- popup-main.js: Lightweight entry point that initializes the modular system

📦 Core Manager Modules:
- popupManager.js: Main coordinator orchestrating all popup functionality
- popupInitializer.js: Handles DOM element initialization and validation
- popupEventHandlers.js: Manages all user interactions and UI events
- popupModelManager.js: AI model loading, selection, and management
- popupSettingsManager.js: Backend configuration and settings persistence

🔧 Feature Modules:
- formFillerHandler.js: Form detection and filling functionality
- formAnalysisService.js: Form content analysis and mapping
- uiController.js: UI state management and visual feedback
- resultsHandler.js: Results display and data management
- chatHandler.js: Chat interface and AI interactions
- dataExtractor.js: Page content extraction from web pages
- apiClient.js: Backend API communication
- authManager.js: User authentication handling
✅ Maintainability: Each module has a single responsibility
✅ Modular Design: Small, focused files for easy navigation and maintenance
✅ Testability: Modules can be tested independently
✅ Extensibility: New features can be added as separate modules
✅ Debugging: Clear separation of concerns makes troubleshooting easier
popup-main.js
└── PopupManager
    ├── PopupInitializer (DOM setup)
    ├── PopupSettingsManager (backend config)
    ├── PopupModelManager (AI models)
    ├── PopupEventHandlers (user interactions)
    └── Feature Modules
        ├── FormFillerHandler
        ├── UIController
        ├── ResultsHandler
        ├── ChatHandler
        └── DataExtractor
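The hierarchy above can be sketched as plain JavaScript classes. This is a simplified illustration of the coordinator pattern only; the real modules take more dependencies and carry far more logic:

```javascript
// Simplified sketch of the popup's coordinator pattern. Only the wiring
// and startup order are shown; class bodies here are placeholders.
class PopupInitializer {
  init() { return { initialized: true }; } // would query and validate DOM elements
}

class PopupSettingsManager {
  constructor() { this.backendUrl = "http://localhost:3001"; } // default backend
}

class PopupManager {
  constructor() {
    // The coordinator owns its sub-managers and orchestrates startup:
    // DOM setup first, then settings, then models and event handlers.
    this.initializer = new PopupInitializer();
    this.settings = new PopupSettingsManager();
  }
  start() {
    const dom = this.initializer.init();
    return dom.initialized ? this.settings.backendUrl : null;
  }
}

const popup = new PopupManager();
console.log(popup.start()); // "http://localhost:3001"
```

Because each sub-manager is constructed and started by the coordinator, a single module can be swapped out or unit-tested in isolation.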
- Node.js (v14 or higher)
- npm or yarn
# Install dependencies
npm run install:all
# Start development server
npm run dev
- Open Chrome and navigate to chrome://extensions/ (edge://extensions/ for Edge browsers)
- Enable "Developer mode"
- Click "Load unpacked" and select the extension folder
- The extension should now appear in your browser toolbar
Experience all extension features with our live interactive demo at:
🌐 https://hddevteam.github.io/smart-form-filler/
The demo includes:
- 🍽️ Restaurant Feedback Form: Complete with realistic scenarios
- 📊 Data Extraction: Interactive profile extraction demo
- 💬 AI Chat: Chat with extracted data functionality
- 🎯 Prompt Examples: Specific scenarios like birthday celebrations, business dinners, family meals
🍽️ Satisfied Customer:
"Fill this restaurant feedback form as John Smith (john.smith@techcorp.com) who just had dinner at Mario's Italian Restaurant. Give a 5-star rating and positive detailed comments about the seafood pasta and excellent service."
🎂 Birthday Celebration:
"Fill this feedback as someone who celebrated their birthday here. Mention the surprise dessert, decorations, and how the staff made the evening special."
💼 Business Lunch:
"Complete this form as a business professional who brought clients here. Focus on the quiet atmosphere, prompt service, and quality food that impressed the clients."
cd backend
npm run dev
The backend server will start on http://localhost:3001
- GET /api/extension/health: Health check
- GET /api/extension/models: Available AI models
- POST /api/extension/extract-data-sources: Extract page data
- POST /api/extension/chat-with-data: Chat with extracted data
- POST /api/form-filler/analyze-form-relevance: Analyze form relevance
- POST /api/form-filler/analyze-field-mapping: Generate field mappings
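A quick way to exercise the endpoints above from Node is to build the full URL from the configured backend and hit the health check. The `joinUrl` helper is illustrative; only the endpoint paths come from the list above:

```javascript
// Minimal sketch of calling the backend's health endpoint.
// joinUrl is an illustrative helper, not part of the extension's API.
function joinUrl(base, path) {
  // Tolerate a trailing slash on the configured backend URL
  return base.replace(/\/+$/, "") + path;
}

const backendUrl = "http://localhost:3001";
const healthUrl = joinUrl(backendUrl, "/api/extension/health");
console.log(healthUrl); // "http://localhost:3001/api/extension/health"

// With the backend running (npm run dev), check it via fetch (Node 18+):
async function checkHealth() {
  const res = await fetch(healthUrl);
  return res.ok; // true when the backend is reachable
}
```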
The extension includes a built-in settings interface for configuring the backend connection:
- Open Settings: Click the ⚙️ settings button in the extension header
- Configure Backend URL: Enter your backend server URL (default: http://localhost:3001)
- Test Connection: Click "Test" to verify the connection
- Save Settings: Click "Save" to apply the new configuration
- Persistent Storage: Settings are saved across browser sessions
- Connection Testing: Real-time validation of backend connectivity
- Error Handling: Clear feedback for connection issues
- Auto-reload: Models automatically refresh when backend changes
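Before a backend URL is persisted, it helps to normalize and sanity-check the user's input. A hedged sketch of that step; `normalizeBackendUrl` is illustrative, not the extension's exact validation logic:

```javascript
// Illustrative sketch: normalize a user-entered backend URL before saving.
// Trims whitespace, drops trailing slashes, and rejects non-HTTP schemes.
function normalizeBackendUrl(input) {
  const trimmed = input.trim().replace(/\/+$/, ""); // drop trailing slashes
  const url = new URL(trimmed); // throws on malformed input
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error(`Unsupported protocol: ${url.protocol}`);
  }
  return url.origin; // e.g. "http://localhost:3001"
}

console.log(normalizeBackendUrl("http://localhost:3001/ "));
// "http://localhost:3001"
```

Storing the normalized origin keeps later URL joins (endpoint path appended to base) free of double-slash bugs.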
Backend URL: http://localhost:3001
Copy .env.example to .env and configure your environment variables:
cd backend
cp .env.example .env
For local AI model support, add to your .env file:
OLLAMA_URL=http://localhost:11434
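On the backend, this variable would typically be read with a localhost fallback, so Ollama works out of the box on its default port. A sketch; the actual config module may differ:

```javascript
// Sketch: read the Ollama URL from the environment, falling back to the
// default local port when .env does not set OLLAMA_URL.
const OLLAMA_URL = process.env.OLLAMA_URL || "http://localhost:11434";
console.log(OLLAMA_URL);
```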
This extension supports Ollama local models for privacy-focused AI! Use your own locally-hosted models alongside cloud providers.
Visit https://ollama.ai/ and install Ollama for your platform.
ollama serve
# Recommended models for form filling and data extraction
ollama pull llama2
ollama pull mistral
ollama pull codellama
ollama pull qwen2.5:7b
ollama pull deepseek-r1
- Open the extension popup
- Click the 🔄 refresh button next to "AI Model"
- Select from Local Models (Ollama) or Cloud Models
- Enjoy private, local AI processing!
- Auto-Discovery: Automatically detects running Ollama servers
- Model Hot-Loading: Refresh model list without restarting
- Unified Interface: Seamless switching between local and cloud models
- Privacy-First: Data never leaves your machine with local models
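Auto-discovery boils down to querying Ollama's /api/tags endpoint and reading the model names out of its JSON response. A sketch of the parsing step; the sample payload mirrors Ollama's documented {"models": [{"name": ...}]} response shape:

```javascript
// Sketch: extract model names from Ollama's GET /api/tags response.
// Returns an empty list when no models key is present.
function listOllamaModels(tagsResponse) {
  return (tagsResponse.models || []).map((m) => m.name);
}

// Example payload, as returned by GET http://localhost:11434/api/tags
const sample = {
  models: [
    { name: "llama2:latest", size: 3825819519 },
    { name: "mistral:latest", size: 4109865159 },
  ],
};
console.log(listOllamaModels(sample)); // ["llama2:latest", "mistral:latest"]
```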
| Model | Use Case | Command |
|---|---|---|
| llama2 | General tasks | ollama pull llama2 |
| mistral | High performance | ollama pull mistral |
| codellama | Code understanding | ollama pull codellama |
| qwen2.5:7b | Multilingual | ollama pull qwen2.5:7b |
| deepseek-r1 | Reasoning tasks | ollama pull deepseek-r1 |
Service Unavailable Message
If you see a "Service Unavailable" message:
- Check Backend Server: Ensure the backend is running on the configured URL
- Verify URL: Click ⚙️ settings and verify the backend URL is correct
- Test Connection: Use the "Test" button in settings to verify connectivity
- Check Network: Ensure no firewall or network issues blocking the connection
Models Not Loading
- Backend Status: Verify the backend server is running (npm run dev)
- URL Configuration: Check settings for the correct backend URL
- Refresh Models: Click the 🔄 refresh button
- Check Logs: Look at the browser console for specific error messages
Models Not Showing?
- Check Ollama Status: curl http://localhost:11434/api/tags
- List Models: ollama list
- Restart Ollama: ollama serve
- Refresh Extension: Click the 🔄 button
Connection Issues?
- Ensure Ollama runs on http://localhost:11434
- Check firewall settings
- Update OLLAMA_URL if using a custom port
# Run tests
npm test
# Build extension for production
npm run build:extension
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the ISC License.