A powerful, privacy-focused AI tutoring application that runs entirely on your local machine. Get personalized explanations and generate custom quizzes across multiple subjects without sending any data to external servers.
- Multiple Education Levels: School, High School, Graduate, PG/PhD
- Subject Variety: Math, History, Computer Science, Physics, Biology, Chemistry
- Adaptive Explanations: Content complexity adjusts to your education level
- Explain a Topic: Get detailed, step-by-step explanations with examples
- Generate a Quiz: Create custom multiple-choice questions with explanations
- Local Processing: All AI computations happen on your device
- No Data Transfer: Your questions and conversations never leave your machine
- Offline Capable: Works without internet once models are downloaded
- Gemma3: Google's latest model, optimized for educational content
- DeepSeek Coder: Specialized for programming and computer science
- Llama3: Meta's powerful general-purpose model
- Auto-Detection: Automatically discovers installed Ollama models
- Python 3.7+ installed on your system
- Ollama installed and running (Download Ollama)
1. **Clone the repository**

   ```bash
   git clone https://github.com/hari7261/AI-Tutor.git
   cd AI-Tutor
   ```

2. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   ```

3. **Install AI models** (choose one or more)

   ```bash
   # Recommended: Gemma3 (best for general education)
   ollama pull gemma3
   # For coding and computer science
   ollama pull deepseek-coder
   # Alternative general-purpose model
   ollama pull llama3
   ```

4. **Start Ollama server** (if not already running)

   ```bash
   ollama serve
   ```

5. **Run the application**

   ```bash
   streamlit run app.py
   ```

6. **Open your browser** and navigate to `http://localhost:8501`
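Before launching the app, it can be useful to confirm that something is actually listening on Ollama's default port. This small helper is illustrative only (it is not part of the repository):

```python
import socket

def ollama_reachable(host: str = "localhost", port: int = 11434,
                     timeout: float = 1.0) -> bool:
    """Return True if a server is listening on Ollama's default port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns `False`, start the server with `ollama serve` before running the app.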
- Education Level: Select your current academic level
- Subject: Choose the subject you want to study
- Mode: Pick between explanation or quiz generation
- AI Model: The app will automatically detect and list available models
- Explanation Mode: "Explain photosynthesis" or "How does machine learning work?"
- Quiz Mode: "Create a quiz about World War 2" or "Test me on calculus derivatives"
- Get detailed explanations with examples
- Receive custom quizzes with immediate feedback
- Build on previous conversations for deeper understanding
```
AI-Tutor/
├── app.py                  # Main Streamlit application
├── requirements.txt        # Python dependencies
├── README.md               # Project documentation
├── LICENSE                 # MIT License
├── .gitignore              # Git ignore rules
├── assets/                 # Images and media
│   └── demo.gif            # Application demo
├── docs/                   # Additional documentation
│   ├── installation.md     # Detailed installation guide
│   ├── usage.md            # Usage examples and tips
│   └── troubleshooting.md  # Common issues and solutions
├── config/                 # Configuration files
│   └── models.yaml         # Model configuration
└── tests/                  # Test files
    └── test_app.py         # Unit tests
```
- Automatically discovers installed Ollama models
- Handles different API response formats
- Prioritizes models based on educational performance
- Provides fallback options and error handling
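The detection-and-prioritization logic above can be sketched as follows. Ollama's REST API exposes installed models via `GET /api/tags`, which returns JSON of the form `{"models": [{"name": ...}, ...]}`; the `PREFERRED` order and both helper functions below are illustrative, not the app's actual code:

```python
import json
from typing import Optional
from urllib.request import urlopen

PREFERRED = ["gemma3:latest", "deepseek-coder", "llama3"]  # illustrative priority order

def list_installed_models(base_url: str = "http://localhost:11434") -> list:
    """Query a running Ollama server for its installed models."""
    with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return [m["name"] for m in json.load(resp).get("models", [])]

def pick_model(tags_response: dict) -> Optional[str]:
    """Choose the highest-priority installed model from an /api/tags payload."""
    installed = [m["name"] for m in tags_response.get("models", [])]
    for name in PREFERRED:
        for candidate in installed:
            # match either the exact tag or the bare model name
            if candidate == name or candidate.split(":")[0] == name:
                return candidate
    # fallback: first installed model, or None if nothing is installed
    return installed[0] if installed else None
```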
- Adjusts explanation complexity based on selected level
- Customizes vocabulary and examples
- Scales problem difficulty appropriately
- Tailors AI responses to subject context
- Incorporates subject-specific terminology
- Provides relevant examples and analogies
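A level- and subject-aware prompt of this kind could be assembled roughly as follows; the wording, style table, and helper name are hypothetical, not taken from `app.py`:

```python
# Illustrative style hints per education level (not the app's actual prompts)
LEVEL_STYLE = {
    "School": "Use simple words and everyday examples.",
    "High School": "Assume basic algebra and science background.",
    "Graduate": "Use precise terminology and formal reasoning.",
    "PG/PhD": "Assume expert background; reference standard results where relevant.",
}

def build_prompt(question, level, subject, mode="explain"):
    """Compose a tutoring prompt tuned to education level and subject."""
    task = ("Explain the following topic step by step with examples"
            if mode == "explain"
            else "Create a multiple-choice quiz with answer explanations about")
    return (f"You are a {subject} tutor for a {level} student. "
            f"{LEVEL_STYLE.get(level, '')} {task}: {question}")
```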
- Real-time response display for better user experience
- Handles connection errors gracefully
- Provides visual feedback during generation
- Maintains conversation history
- Preserves context across interactions
- Enables follow-up questions and clarifications
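Context preservation could look roughly like this; in a Streamlit app the history list would typically live in `st.session_state`. The helper names and the trimming policy are illustrative assumptions:

```python
def append_turn(history, role, content, max_turns=20):
    """Record a message and trim old turns so prompts stay within context limits."""
    history.append({"role": role, "content": content})
    if len(history) > max_turns:
        del history[: len(history) - max_turns]  # drop the oldest turns first
    return history

def history_as_context(history):
    """Flatten the conversation into a prompt prefix for follow-up questions."""
    return "\n".join(f"{t['role']}: {t['content']}" for t in history)
```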
The application prioritizes models in the following order:

1. `gemma3:latest` - best for general education
2. `deepseek-coder` - optimal for programming topics
3. `llama3` - reliable general-purpose alternative
Edit `config/models.yaml` to customize model preferences:

```yaml
models:
  preferred_order:
    - "gemma3:latest"
    - "deepseek-coder"
    - "llama3"
  subject_recommendations:
    "Computer Science": "deepseek-coder"
    "Math": "gemma3"
    "Physics": "gemma3"
```
1. **Fork the repository**

2. **Create a virtual environment**

   ```bash
   python -m venv ai-tutor-env
   source ai-tutor-env/bin/activate  # On Windows: ai-tutor-env\Scripts\activate
   ```

3. **Install development dependencies**

   ```bash
   pip install -r requirements-dev.txt
   ```

4. **Run tests**

   ```bash
   pytest tests/
   ```
- Follow PEP 8 guidelines
- Use type hints where appropriate
- Add docstrings to functions and classes
- Maintain test coverage above 80%
"No Ollama models found"
- Ensure Ollama is running:
ollama serve
- Check installed models:
ollama list
- Install a model:
ollama pull gemma3
Connection errors
- Verify Ollama is accessible on default port (11434)
- Check firewall settings
- Restart Ollama service
Performance issues
- Use smaller models for better speed
- Ensure sufficient RAM (8GB+ recommended)
- Close unnecessary applications
See docs/troubleshooting.md for detailed solutions.
We welcome contributions! Please see our Contributing Guidelines for details.
- Report bugs and issues
- Suggest new features
- Improve documentation
- Add test cases
- Enhance UI/UX
| Model | Size | Speed | Education Quality |
|---|---|---|---|
| Gemma3 | 3.3GB | Fast | ⭐⭐⭐⭐⭐ |
| DeepSeek Coder | 776MB | Very Fast | ⭐⭐⭐⭐ (CS topics) |
| Llama3 | 4.7GB | Medium | ⭐⭐⭐⭐ |
- Multi-language Support - Add support for multiple languages
- Voice Integration - Voice-to-text and text-to-voice
- Progress Tracking - Learning progress and analytics
- Study Plans - Automated curriculum generation
- Collaborative Learning - Share sessions with classmates
- Mobile App - Native mobile applications
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the local AI infrastructure
- Streamlit for the amazing web framework
- Google for the Gemma model family
- DeepSeek for the specialized coding model
- ๐ Issues: GitHub Issues
- ๐ฌ Discussions: GitHub Discussions
- ๐ง Email: Contact Us
Made with ❤️ for learners everywhere
⭐ Star this repo if you find it helpful!