🎓 AI Tutor - Local AI Study Buddy


A powerful, privacy-focused AI tutoring application that runs entirely on your local machine. Get personalized explanations and generate custom quizzes across multiple subjects without sending any data to external servers.

[AI Tutor demo — see assets/demo.gif]

✨ Features

🎯 Personalized Learning

  • Multiple Education Levels: School, High School, Graduate, PG/PhD
  • Subject Variety: Math, History, Computer Science, Physics, Biology, Chemistry
  • Adaptive Explanations: Content complexity adjusts to your education level

🤖 Dual Learning Modes

  • Explain a Topic: Get detailed, step-by-step explanations with examples
  • Generate a Quiz: Create custom multiple-choice questions with explanations

🔒 100% Privacy

  • Local Processing: All AI computations happen on your device
  • No Data Transfer: Your questions and conversations never leave your machine
  • Offline Capable: Works without internet once models are downloaded

🧠 Multiple AI Models

  • Gemma3: Google's latest model, optimized for educational content
  • DeepSeek Coder: Specialized for programming and computer science
  • Llama3: Meta's powerful general-purpose model
  • Auto-Detection: Automatically discovers installed Ollama models

🚀 Quick Start

Prerequisites

  1. Python 3.7+ installed on your system
  2. Ollama installed and running (Download Ollama)

Installation

  1. Clone the repository

    git clone https://github.com/hari7261/AI-Tutor.git
    cd AI-Tutor
  2. Install dependencies

    pip install -r requirements.txt
  3. Install AI models (choose one or more)

    # Recommended: Gemma3 (best for general education)
    ollama pull gemma3
    
    # For coding and computer science
    ollama pull deepseek-coder
    
    # Alternative general-purpose model
    ollama pull llama3
  4. Start Ollama server (if not already running)

    ollama serve
  5. Run the application

    streamlit run app.py
  6. Open your browser and navigate to http://localhost:8501

🎮 How to Use

1. Configure Your Learning Preferences

  • Education Level: Select your current academic level
  • Subject: Choose the subject you want to study
  • Mode: Pick between explanation or quiz generation
  • AI Model: The app will automatically detect and list available models

2. Ask Questions or Request Topics

  • Explanation Mode: "Explain photosynthesis" or "How does machine learning work?"
  • Quiz Mode: "Create a quiz about World War 2" or "Test me on calculus derivatives"

3. Interactive Learning

  • Get detailed explanations with examples
  • Receive custom quizzes with immediate feedback
  • Build on previous conversations for deeper understanding

๐Ÿ“ Project Structure

AI-Tutor/
├── app.py                 # Main Streamlit application
├── requirements.txt       # Python dependencies
├── README.md              # Project documentation
├── LICENSE                # MIT License
├── .gitignore             # Git ignore rules
├── assets/                # Images and media
│   └── demo.gif           # Application demo
├── docs/                  # Additional documentation
│   ├── installation.md    # Detailed installation guide
│   ├── usage.md           # Usage examples and tips
│   └── troubleshooting.md # Common issues and solutions
├── config/                # Configuration files
│   └── models.yaml        # Model configuration
└── tests/                 # Test files
    └── test_app.py        # Unit tests

🧩 Core Modules

1. Model Detection (get_available_models())

  • Automatically discovers installed Ollama models
  • Handles different API response formats
  • Prioritizes models based on educational performance
  • Provides fallback options and error handling
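The detection code itself lives in app.py and is not reproduced in this README; as an illustration only, a minimal version of model discovery might look like the sketch below. The function name comes from the section above, while the body is an assumption; `/api/tags` is Ollama's standard endpoint for listing installed models.

```python
import json
import urllib.request
from typing import List

def get_available_models(base_url: str = "http://localhost:11434") -> List[str]:
    """Query the local Ollama server for installed models.

    Returns an empty list if the server is unreachable or the
    response is malformed, so the caller can fall back to a
    helpful "No Ollama models found" message.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except (OSError, ValueError):
        return []
    # Current Ollama versions return {"models": [{"name": "gemma3:latest"}, ...]}
    return [m.get("name", "") for m in data.get("models", [])]
```

Returning an empty list instead of raising keeps the Streamlit UI responsive even when Ollama is not running.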

2. Education Level Adaptation

  • Adjusts explanation complexity based on selected level
  • Customizes vocabulary and examples
  • Scales problem difficulty appropriately

3. Subject-Specific Prompting

  • Tailors AI responses to subject context
  • Incorporates subject-specific terminology
  • Provides relevant examples and analogies
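The two modules above boil down to prompt construction. A minimal sketch of how level- and subject-aware prompting could work is shown below; `LEVEL_STYLE` and `build_prompt` are hypothetical names for illustration, not the actual identifiers in app.py.

```python
# Tone guidance per education level (illustrative wording).
LEVEL_STYLE = {
    "School": "Use simple words and everyday analogies.",
    "High School": "Use clear explanations with worked examples.",
    "Graduate": "Use precise terminology and moderate depth.",
    "PG/PhD": "Assume a strong background; be rigorous and concise.",
}

def build_prompt(question: str, level: str, subject: str, mode: str) -> str:
    """Compose a prompt that adapts tone and task to the learner."""
    style = LEVEL_STYLE.get(level, LEVEL_STYLE["High School"])
    if mode == "quiz":
        task = (f"Create multiple-choice questions about this {subject} topic, "
                "with an explanation for each answer.")
    else:
        task = f"Explain the following {subject} topic step by step, with examples."
    return f"You are a tutor for a {level} student. {style}\n{task}\n\nTopic: {question}"
```

Keeping the level/subject logic in plain data structures like this makes it easy to extend with new subjects or levels.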

4. Streaming Response Handler

  • Real-time response display for better user experience
  • Handles connection errors gracefully
  • Provides visual feedback during generation
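Ollama streams generations as newline-delimited JSON objects, each carrying a `response` fragment and a `done` flag. A self-contained sketch of the parsing half of such a handler (the function name is illustrative):

```python
import json
from typing import Iterable, Iterator

def iter_tokens(lines: Iterable[bytes]) -> Iterator[str]:
    """Yield text fragments from Ollama's streaming NDJSON response.

    Each line is a JSON object like {"response": "...", "done": false};
    the UI can append fragments to a placeholder as they arrive.
    """
    for raw in lines:
        if not raw.strip():
            continue  # skip keep-alive blank lines
        chunk = json.loads(raw)
        if chunk.get("response"):
            yield chunk["response"]
        if chunk.get("done"):
            break
```

In the app, the byte lines would come from something like `requests.post(f"{base_url}/api/generate", json=payload, stream=True)` followed by `resp.iter_lines()`, with a try/except around the request for graceful connection-error handling.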

5. Session Management

  • Maintains conversation history
  • Preserves context across interactions
  • Enables follow-up questions and clarifications
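In Streamlit, history is typically kept in `st.session_state` and flattened back into the prompt so the model can resolve follow-ups. A framework-free sketch of that flattening step (the helper name is hypothetical):

```python
from typing import List, Tuple

def history_to_context(history: List[Tuple[str, str]], max_turns: int = 5) -> str:
    """Flatten recent (question, answer) pairs into a context block.

    Only the last few turns are kept so the prompt stays within the
    model's context window while still supporting follow-ups like
    "explain that more simply".
    """
    recent = history[-max_turns:]
    return "\n".join(f"Student: {q}\nTutor: {a}" for q, a in recent)
```

The resulting block would be prepended to each new question before it is sent to the model.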

🔧 Configuration

Model Priority

The application prioritizes models in the following order:

  1. gemma3:latest - Best for general education
  2. deepseek-coder - Optimal for programming topics
  3. llama3 - Reliable general-purpose alternative
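Selecting a default from that priority list is a small matching problem, since installed models may carry tags (e.g. `llama3:8b`). A possible sketch, with hypothetical names:

```python
from typing import List, Optional

PREFERRED = ["gemma3:latest", "deepseek-coder", "llama3"]

def pick_default_model(installed: List[str],
                       preferred: List[str] = PREFERRED) -> Optional[str]:
    """Return the highest-priority installed model.

    Matches either the exact name or the base name without a tag,
    so "llama3" in the priority list matches an installed "llama3:8b".
    """
    for want in preferred:
        for have in installed:
            if have == want or have.split(":")[0] == want.split(":")[0]:
                return have
    # Nothing from the priority list: fall back to whatever is installed.
    return installed[0] if installed else None
```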

Custom Model Configuration

Edit config/models.yaml to customize model preferences:

models:
  preferred_order:
    - "gemma3:latest"
    - "deepseek-coder"
    - "llama3"
  
  subject_recommendations:
    "Computer Science": "deepseek-coder"
    "Math": "gemma3"
    "Physics": "gemma3"

๐Ÿ› ๏ธ Development

Setting up Development Environment

  1. Fork the repository

  2. Create a virtual environment

    python -m venv ai-tutor-env
    source ai-tutor-env/bin/activate  # On Windows: ai-tutor-env\Scripts\activate
  3. Install development dependencies

    pip install -r requirements-dev.txt
  4. Run tests

    pytest tests/

Code Style

  • Follow PEP 8 guidelines
  • Use type hints where appropriate
  • Add docstrings to functions and classes
  • Maintain test coverage above 80%

๐Ÿ› Troubleshooting

Common Issues

"No Ollama models found"

  • Ensure Ollama is running: ollama serve
  • Check installed models: ollama list
  • Install a model: ollama pull gemma3

Connection errors

  • Verify Ollama is accessible on default port (11434)
  • Check firewall settings
  • Restart Ollama service

Performance issues

  • Use smaller models for better speed
  • Ensure sufficient RAM (8GB+ recommended)
  • Close unnecessary applications

See docs/troubleshooting.md for detailed solutions.

๐Ÿค Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

Ways to Contribute

  • ๐Ÿ› Report bugs and issues
  • ๐Ÿ’ก Suggest new features
  • ๐Ÿ“– Improve documentation
  • ๐Ÿงช Add test cases
  • ๐ŸŽจ Enhance UI/UX

📊 Performance Metrics

Model           Size    Speed      Education Quality
Gemma3          3.3GB   Fast       ⭐⭐⭐⭐⭐
DeepSeek Coder  776MB   Very Fast  ⭐⭐⭐⭐ (CS topics)
Llama3          4.7GB   Medium     ⭐⭐⭐⭐

๐Ÿ—บ๏ธ Roadmap

  • Multi-language Support - Add support for multiple languages
  • Voice Integration - Voice-to-text and text-to-voice
  • Progress Tracking - Learning progress and analytics
  • Study Plans - Automated curriculum generation
  • Collaborative Learning - Share sessions with classmates
  • Mobile App - Native mobile applications

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • Ollama for providing the local AI infrastructure
  • Streamlit for the amazing web framework
  • Google for the Gemma model family
  • DeepSeek for the specialized coding model

📞 Support


Made with ❤️ for learners everywhere

โญ Star this repo if you find it helpful!
