A beautiful, private AI chat application built with Streamlit and powered by Ollama. Experience intelligent conversations with a clean, modern interface that keeps your data completely local and secure.
- 🔐 100% Private: All conversations stay on your local machine
- ⚡ Fast & Responsive: Real-time streaming responses
- 🎨 Beautiful UI: Clean, modern dark blue theme
- 🤖 Multiple Models: Support for various Ollama models
- 🎛️ Customizable: Adjustable temperature and model settings
- 📱 Responsive Design: Works great on all screen sizes
Before running CommonChat AI, make sure you have:
- Python 3.8+ installed
- Ollama installed and running
- At least one Ollama model downloaded
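As a quick sanity check, the prerequisites above can be verified from Python. A minimal sketch — `check_prerequisites` is a hypothetical helper for illustration, not part of the app:

```python
import shutil
import sys

def check_prerequisites(min_python=(3, 8)):
    """Return a dict of prerequisite checks (illustrative helper only)."""
    return {
        # Python 3.8+ is required by the app
        "python": sys.version_info[:2] >= min_python,
        # Ollama must be installed and on PATH
        "ollama": shutil.which("ollama") is not None,
    }
```

Running `check_prerequisites()` should report `True` for both keys on a correctly prepared machine.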
- Clone the repository
git clone https://github.com/hari7261/CommonChat-AI.git
cd CommonChat-AI
- Install dependencies
pip install -r requirements.txt
- Install and set up Ollama
# Install Ollama (visit https://ollama.ai for installation instructions)
# Download a model (e.g., gemma3)
ollama pull gemma3
- Run the application
streamlit run app.py
- Open your browser and navigate to
http://localhost:8501
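Under the hood, Ollama streams responses as newline-delimited JSON over its local HTTP API (`http://localhost:11434` by default), where each chunk carries a `"response"` text fragment and a `"done"` flag on the last chunk. A minimal sketch of parsing such a stream — `extract_tokens` is illustrative, not the app's actual code:

```python
import json

def extract_tokens(lines):
    """Yield text tokens from Ollama's newline-delimited JSON stream.

    Each line is a JSON object with a "response" fragment; the final
    chunk sets "done" to true (per the Ollama /api/generate format).
    """
    for line in lines:
        if not line.strip():
            continue  # skip blank keep-alive lines
        chunk = json.loads(line)
        yield chunk.get("response", "")
        if chunk.get("done"):
            break
```

Joining the yielded fragments reconstructs the full reply, which is what enables the real-time streaming display in the UI.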
The application supports various Ollama models:
- gemma3 (default)
- llama3
- mistral
- codellama
Adjust the creativity of responses:
- 0.0-0.3: More focused and deterministic
- 0.4-0.7: Balanced (default: 0.7)
- 0.8-1.0: More creative and random
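In terms of the Ollama API, temperature is passed inside the request's `options` field. A hedged sketch of how such a request to `/api/generate` might be assembled — `build_request` is a hypothetical helper, not the app's actual code:

```python
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model, prompt, temperature=0.7):
    """Build a JSON payload for Ollama's /api/generate endpoint."""
    # Clamp to the 0.0-1.0 range exposed in the UI
    temperature = max(0.0, min(1.0, temperature))
    return {
        "model": model,
        "prompt": prompt,
        "stream": True,  # stream tokens for real-time responses
        "options": {"temperature": temperature},
    }
```

POSTing this payload to `OLLAMA_URL` returns the newline-delimited JSON stream described above.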
CommonChat-AI/
├── app.py # Main application file
├── requirements.txt # Python dependencies
├── README.md # Project documentation
├── LICENSE # MIT License
├── .gitignore # Git ignore rules
└── docs/ # Additional documentation
└── SETUP.md # Detailed setup guide
- Fork the repository
- Create a feature branch
git checkout -b feature/your-feature-name
- Make your changes
- Test thoroughly
- Commit and push
git commit -m "Add your feature description"
git push origin feature/your-feature-name
- Create a Pull Request
- Follow PEP 8 guidelines
- Use meaningful variable names
- Add comments for complex logic
- Keep functions small and focused
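As an illustration of these guidelines, a small, focused function with a meaningful name and a comment where the logic is non-obvious (example only, not from the codebase):

```python
def truncate_history(messages, max_messages=20):
    """Return only the most recent messages to keep the prompt small."""
    # A non-positive limit means "keep nothing".
    if max_messages <= 0:
        return []
    # Negative slice keeps the last max_messages items in order.
    return messages[-max_messages:]
```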
CommonChat AI is designed with privacy as a core principle:
- No Data Collection: We don't collect or store any personal data
- Local Processing: All AI processing happens on your machine
- No External Calls: Aside from the local Ollama server, no external services are contacted
- Open Source: Complete transparency with open-source code
We welcome contributions! Here's how you can help:
- Report Bugs: Use the issue tracker to report bugs
- Suggest Features: Propose new features via issues
- Submit PRs: Help improve the code
- Documentation: Help improve our docs
- Testing: Help test on different platforms
- @hari7261 - Creator & Maintainer
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama - For providing excellent local AI models
- Streamlit - For the amazing web app framework
- Inter Font - For the beautiful typography
Need help? Here's how to get support:
- GitHub Issues: For bugs and feature requests
- Discussions: For questions and community chat
If you found this project helpful, please consider giving it a star!
Made with ❤️ by Hariom