Welcome to OllamaTor 👋

OllamaTor is a user-friendly desktop application that brings the power of Ollama's local Large Language Models (LLMs) to your fingertips. Chat with AI, adjust settings, and monitor system usage.

Screenshot of OllamaTor

Features

  • User-Friendly: Easy installation and intuitive interface.
  • Customizable: Adjust temperature and chat-history length, and choose from many AI models.
  • Resource Monitoring: Real-time CPU, RAM, and GPU (load and memory) usage, plus an AI performance metric, "TPM" (tokens per minute).
  • Privacy: Your data stays local. No cloud, no tracking.
  • Performance: Leverage the full power of your computer for fast responses.
  • Offline Availability: Works even without an internet connection.
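The TPM figure above is a simple rate: tokens generated over elapsed wall-clock time, scaled to a minute. A minimal sketch of such a metric (the function name and inputs are assumptions, not OllamaTor's actual code):

```python
def tokens_per_minute(token_count: int, elapsed_seconds: float) -> float:
    """Convert a token count over an elapsed wall-clock span into a TPM rate."""
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return token_count * 60.0 / elapsed_seconds
```

For example, 300 tokens generated in 30 seconds corresponds to a TPM of 600.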

TO-DO:

  • Stop button to cancel response generation -> Available since v0.0.3
  • Fix math rendering (KaTeX)
  • Customizable copilots (an initial prompt that gives the AI a better understanding of what it's working on)

Installation

Download Ollama.exe

Getting Started

  1. Select a Model: Choose a model from the dropdown.
  2. Chat: Type your prompt & click "Send".
  3. Settings: Use the gear icon to change temperature and history.
  4. Help: Use the help icon to start the step-by-step tour, download Ollama, and get instructions for properly installing models.

Requirements

  • Windows 10/11 (other OS not tested)
  • Chrome or Edge (one of the two web browsers is required to run this application)
  • Ollama (can also be downloaded later in OllamaTor itself; internet connection is needed)
  • Downloaded Ollama models
  • Python (optional: only needed to run from source, not for the compiled .exe), with:
    • eel
    • requests
    • psutil
    • nvidia-ml-py
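If you want to run OllamaTor from source, the four packages above can be installed in one step (assuming a working Python and pip on your PATH):

```shell
pip install eel requests psutil nvidia-ml-py
```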

Contributing

Report bugs or suggest features via GitHub Issues. Pull requests welcome!

About

Chat offline and locally with many AI models. OllamaTor is a Python/Eel-based front-end for the Ollama API.
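As a front-end for the Ollama API, the app ultimately sends HTTP requests to the local Ollama server. A minimal sketch of what such a request body looks like, following Ollama's `/api/generate` schema (illustrative only, not OllamaTor's actual code; the helper name is an assumption):

```python
def build_generate_payload(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Assemble a non-streaming request body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
        "options": {"temperature": temperature},
    }

# The payload would then be POSTed to the local Ollama server, e.g.:
# requests.post("http://localhost:11434/api/generate",
#               json=build_generate_payload("llama3", "Hello"))
```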
