Code Mentor Bot is a local-first AI assistant that explains code snippets fully offline, with no API calls or cloud dependencies.
Powered by:
- 🧠 Local LLMs via Ollama
- 🖥️ Lightweight Streamlit UI
- 🧰 Supports models like `phi`, `codellama`, and `mistral`
*Screenshot: live explanation of a function that adds two numbers using a local model*
- ✍️ Paste any Python (or other general-purpose language) code
- 🤖 Get instant explanations from offline models (see the sketch after this list)
- 🎛️ Switch between multiple models (`phi`, `mistral`, `codellama`)
- ⚠️ Smart static checks (e.g., recursion without a base case)
- 🧪 Tested & CI-friendly with mocked LLM calls
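Under the hood, explanations come from a model served locally by Ollama. Here is a minimal sketch of what that call can look like; the function name `explain_code` and the prompt wording are illustrative rather than the project's actual API, while the endpoint is Ollama's standard local `/api/generate`:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def explain_code(snippet: str, model: str = "phi") -> str:
    """Ask a locally served model to explain a code snippet (illustrative sketch)."""
    payload = json.dumps({
        "model": model,
        "prompt": f"Explain what this code does:\n\n{snippet}",
        "stream": False,  # ask for the full answer in a single JSON response
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Everything runs locally, so the only requirement is a running Ollama server with the chosen model pulled.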
⚠️ Python 3.10+ recommended
```bash
git clone https://github.com/zakkariyaa/mentor-bot.git
cd mentor-bot
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
Install Ollama (https://ollama.com/download), then pull the models you want to use:
```bash
ollama pull phi
ollama pull codellama
ollama pull mistral
```
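You can confirm the models are available locally with Ollama's standard listing command:

```bash
ollama list
```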
Then launch the app:

```bash
streamlit run app.py
```
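For a sense of how little UI code this takes, here is a hedged sketch of a Streamlit front end wired to the `explain_code` helper sketched earlier; the widget labels are assumptions, not the actual contents of `app.py`:

```python
import streamlit as st

st.title("Code Mentor Bot")

# Model switcher for the locally pulled Ollama models
model = st.selectbox("Model", ["phi", "codellama", "mistral"])

# Paste-in area for the code to explain
snippet = st.text_area("Paste your code here", height=200)

if st.button("Explain") and snippet.strip():
    with st.spinner("Asking the local model..."):
        st.markdown(explain_code(snippet, model=model))  # helper from the sketch above
```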
💡 Example Input
```python
def factorial(n):
    return n * factorial(n - 1)  # no base case: triggers the static-check warning
```
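The snippet above recurses without a base case, which is exactly the kind of issue the static checks are meant to flag. One way such a check could be written with Python's `ast` module (a sketch; the repo's actual check may differ):

```python
import ast

def recursion_without_base_case(source: str) -> list[str]:
    """Return names of functions that call themselves but contain no `if` branch."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            calls_itself = any(
                isinstance(n, ast.Call)
                and isinstance(n.func, ast.Name)
                and n.func.id == node.name
                for n in ast.walk(node)
            )
            has_branch = any(isinstance(n, ast.If) for n in ast.walk(node))
            if calls_itself and not has_branch:
                flagged.append(node.name)
    return flagged

print(recursion_without_base_case("def factorial(n):\n    return n * factorial(n - 1)\n"))
# ['factorial']
```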
🧪 Run Tests
```bash
pytest -m "not slow"   # Fast tests only
pytest                 # All tests, including slow LLM tests
```
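The fast path works in CI because the LLM call is mocked. Here is a sketch of what such a test can look like with pytest's `monkeypatch`; the module and function names (`mentor`, `query_ollama`, `explain_code`) are illustrative, not the repo's actual layout:

```python
import pytest

import mentor  # hypothetical module wrapping the Ollama call

def test_explanation_without_real_llm(monkeypatch):
    # Swap the low-level model call for a canned answer so no Ollama server is needed
    monkeypatch.setattr(
        mentor, "query_ollama",
        lambda prompt, model="phi": "This function adds two numbers.",
    )
    assert "adds two numbers" in mentor.explain_code("def add(a, b): return a + b")

@pytest.mark.slow  # excluded by `pytest -m "not slow"`
def test_explanation_with_real_llm():
    answer = mentor.explain_code("def add(a, b): return a + b")
    assert answer.strip()  # a real local model should return a non-empty explanation
```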
✨ Why This Project?

Built to demonstrate:
- 🔒 Offline-friendly AI tooling
- 🧠 LangChain-free LLM integration
- 🧑‍💻 Thoughtful UX for code understanding