Corex WebUI

Corex Logo

A minimalistic WebUI for self-hosted AI.


Overview

Corex is a powerful local WebUI for self-hosted AI, similar in spirit to projects such as Text generation web UI and Open WebUI.

Corex example gif

Features

  • Supports a single local text generation backend, Ollama (more backends are planned).
  • 100% offline and private, with zero telemetry, external resources, or remote update requests.
  • Seamless integration with Ollama for local model execution (see the example requests after this list).
  • Simple setup and configuration for beginners.
  • Web search: optionally enrich AI responses with web search results.
  • Model selection: switch between installed models directly in the interface.
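
Corex talks to a locally running Ollama instance. Whether it calls these exact endpoints internally is an assumption; the commands below simply exercise Ollama's standard REST API (default port 11434), which is useful for confirming that the backend Corex depends on is working:

# List the models Ollama has installed locally
curl http://localhost:11434/api/tags

# Ask a model for a one-off, non-streaming completion
curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'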

Installation

Follow these steps to get corex-webui up and running:

# Clone repository
git clone https://github.com/hevnee/corex-webui
cd corex-webui

# Install dependencies
pip install -r requirements.txt --upgrade

# Launch server
python server.py
# You can also just run `server.py`

# Pull a model to use with the WebUI (llama3:8b is just an example)
ollama pull llama3:8b
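
A quick sanity check after the steps above, assuming Ollama's default configuration (adjust the port if you changed it):

# The pulled model should appear in this list
ollama list
# Ollama's API should answer on its default port
curl http://localhost:11434/api/version

Then open the address that server.py prints in its console output in a browser; the exact host and port depend on the project's configuration.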

Requirements:

  • Python 3.8 or higher
  • Ollama installed and configured
  • At least 8 GB of VRAM for running LLMs locally
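
A quick way to confirm these prerequisites from a terminal (nvidia-smi applies to NVIDIA GPUs only; other vendors have their own tools):

# Check Python and Ollama versions
python --version                                   # should report 3.8 or newer
ollama --version                                   # confirms Ollama is installed
# On NVIDIA GPUs, report total VRAM
nvidia-smi --query-gpu=memory.total --format=csv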

Contributing

We welcome contributions! To get started:

  1. Fork the repository
  2. Create a new branch for your bug fix or feature
  3. Commit your changes
  4. Push to the branch
  5. Open a Pull Request (the usual commands are sketched below)
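
For reference, the standard GitHub flow in commands; the branch name and commit message are placeholders:

# Work in your fork, cloned locally
git checkout -b feature/my-change
git add .
git commit -m "Describe your change"
git push origin feature/my-change
# Then open a Pull Request against hevnee/corex-webui on GitHub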

License

Copyright © 2025 hevnee.
Corex is MIT licensed.