
🤖 Code Mentor Bot

Code Mentor Bot is a local-first AI assistant that explains code snippets (fully offline), with no API calls or cloud dependencies.

Powered by:

  • 🧠 Local LLMs via Ollama
  • 🖥️ Lightweight Streamlit UI
  • 🧰 Supports models like phi, codellama, and mistral

[demo GIF] Live explanation of a function that adds two numbers using a local model


🚀 Features

  • ✍️ Paste any Python (or general-purpose) code
  • 🤖 Get instant explanations using offline models
  • 🎛️ Switch between multiple models (phi, mistral, codellama); see the sketch after this list
  • ⚠️ Smart static checks (e.g., recursion without base case)
  • 🧪 Tested & CI-friendly with mocked LLM calls
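A rough idea of how the model switcher fits together in Streamlit. The widget labels and variable names here are illustrative, not the actual app.py code:

import streamlit as st

MODELS = ["phi", "codellama", "mistral"]

model = st.selectbox("Model", MODELS)        # switch between local models
code = st.text_area("Paste your code here")

if st.button("Explain") and code.strip():
    # The real app would send `code` to the selected local model here.
    st.markdown(f"Explaining with {model}...")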

🛠️ How to Run

⚠️ Python 3.10+ recommended

1. Clone and set up environment

git clone https://github.com/zakkariyaa/mentor-bot.git
cd mentor-bot
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

2. Install Ollama and pull the models

Install Ollama from https://ollama.com/download, then pull the models:

ollama pull phi
ollama pull codellama
ollama pull mistral
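To confirm the models downloaded correctly, list what Ollama has available locally:

ollama list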

3. Run the app

streamlit run app.py
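Streamlit serves the UI on http://localhost:8501 by default; the explanations themselves come from the Ollama server listening locally on port 11434. A minimal sketch of that call using the requests library, assuming Ollama's standard /api/generate endpoint (the actual app.py may be structured differently):

import requests

def explain_code(code: str, model: str = "phi") -> str:
    """Ask a local Ollama model to explain a code snippet (sketch)."""
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": f"Explain what this code does:\n\n{code}",
            "stream": False,  # return the full answer in one response
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]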

💡 Example Input

def factorial(n):
    return n * factorial(n - 1)
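This snippet deliberately has no base case, so alongside the model's explanation the static check should warn about runaway recursion. A minimal sketch of how such a check could work with Python's ast module; the function name is illustrative, not the project's actual implementation:

import ast

def recursion_without_base_case(source: str) -> list[str]:
    """Flag functions that call themselves but contain no if-statement guard (sketch)."""
    warnings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            calls_itself = any(
                isinstance(n, ast.Call)
                and isinstance(n.func, ast.Name)
                and n.func.id == node.name
                for n in ast.walk(node)
            )
            has_branch = any(isinstance(n, ast.If) for n in ast.walk(node))
            if calls_itself and not has_branch:
                warnings.append(f"'{node.name}' recurses without an obvious base case")
    return warnings

print(recursion_without_base_case("def factorial(n):\n    return n * factorial(n - 1)"))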

🧪 Run Tests

pytest -m "not slow"        # Fast tests only
pytest                      # All tests, including slow LLM tests
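The suite mocks out the LLM call so CI stays fast and offline, while end-to-end tests against a real model carry the slow marker. A minimal sketch of what such tests might look like; the app module and explain_code function are assumptions, not the repo's actual test code:

import pytest

def fake_explain(code: str, model: str = "phi") -> str:
    # Stand-in for the real Ollama call so CI never needs a model.
    return "This function adds two numbers."

def test_explanation_uses_mocked_llm(monkeypatch):
    import app  # hypothetical module exposing explain_code
    monkeypatch.setattr(app, "explain_code", fake_explain)
    assert "adds two numbers" in app.explain_code("def add(a, b):\n    return a + b")

@pytest.mark.slow
def test_real_model_roundtrip():
    # Talks to an actual local model; excluded by pytest -m "not slow".
    import app
    assert app.explain_code("def add(a, b):\n    return a + b", model="phi")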


✨ Why This Project?

Built to demonstrate:

  • 🔒 Offline-friendly AI tooling
  • 🧠 LangChain-free LLM integration
  • 🧑‍💻 Thoughtful UX for code understanding
