Corex is a local WebUI for self-hosted AI, in the same family as Text generation web UI and Open WebUI.
- Currently supports a single local text generation backend, Ollama (more will be added later).
- 100% offline and private, with zero telemetry, external resources, or remote update requests.
- Seamless integration with Ollama for local model execution (see the request sketch after this list).
- Simple setup and configuration for beginners.
- Web search: optionally enrich model answers with web search results.
- Model selection: switch between installed models directly in the interface.
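Under the hood, integrating with Ollama generally means talking to its local HTTP API (port 11434 by default). A minimal sketch of the kind of request such a backend sends, assuming Ollama's stock `/api/generate` endpoint; the model name and prompt are just examples:

# Ask a locally pulled model for a completion via Ollama's HTTP API
curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'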
Follow these steps to get corex-webui up and running:
# Clone repository
git clone https://github.com/hevnee/corex-webui
cd corex-webui
# Install dependencies
pip install -r requirements.txt --upgrade
# Launch server
python server.py
# Alternatively, launch by running server.py directly
# Pull a model to chat with (llama3:8b is one example)
ollama pull llama3:8b
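To confirm Ollama is reachable and the model was pulled, you can use the standard Ollama commands below (not specific to corex-webui):

# List installed models locally, then via the HTTP API
ollama list
curl http://localhost:11434/api/tags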
To run corex-webui you will need:
- Python 3.8 or higher
- Ollama installed and configured
- At least 8 GB of VRAM for running LLMs locally
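A quick way to check the first and last of these (nvidia-smi assumes an NVIDIA GPU; other vendors ship their own tools):

# Check Python version and total GPU memory
python --version
nvidia-smi --query-gpu=memory.total --format=csv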
We welcome contributions! To get started:
- Fork the repository
- Create a new branch for your bug fix or feature
- Commit your changes
- Push the branch to your fork
- Open a Pull Request
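As a concrete sketch of that workflow (the username, branch name, and commit message are placeholders):

# Fork on GitHub first, then:
git clone https://github.com/<your-username>/corex-webui
cd corex-webui
git checkout -b fix/my-change
git commit -am "Describe your change"
git push origin fix/my-change
# Finally, open a Pull Request on GitHub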