The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
Free large language model (LLM) support for Neovim, providing commands to interact with LLMs (such as ChatGPT, ChatGLM, Kimi, DeepSeek, OpenRouter, and local models). Supports GitHub Models.
A single-file tkinter-based Ollama GUI project with no external dependencies.
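As a rough illustration of the stdlib-only approach such a project implies, a tkinter window can talk to Ollama's local REST API using nothing but urllib. This is a hedged sketch under assumptions (default port 11434, a model named "llama3" already pulled), not the project's actual code:

```python
# Minimal sketch: a stdlib-only tkinter front end for Ollama's local REST API.
# Assumes Ollama is running on its default port and that a model such as
# "llama3" has been pulled; both are assumptions, not details of the project above.
import json
import tkinter as tk
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL = "llama3"  # assumed model name

def ask(prompt: str) -> str:
    """Send a single non-streaming prompt to Ollama and return the reply text."""
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = request.Request(OLLAMA_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp).get("response", "")

def on_send() -> None:
    prompt = entry.get()
    entry.delete(0, tk.END)
    output.insert(tk.END, f"> {prompt}\n{ask(prompt)}\n\n")

root = tk.Tk()
root.title("Ollama GUI sketch")
output = tk.Text(root, wrap="word", height=20, width=80)
output.pack(fill="both", expand=True)
entry = tk.Entry(root)
entry.pack(fill="x")
tk.Button(root, text="Send", command=on_send).pack()
root.mainloop()
```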
Ollama load-balancing server | A high-performance, easy-to-configure open-source load balancer optimized for Ollama workloads. It helps improve application availability and response times while making efficient use of system resources.
TalkNexus: Ollama Chatbot Multi-Model & RAG Interface
A simple HTML Ollama chatbot that is easy to install: just copy the HTML file to your computer and open it.
Chat with your PDF using a local LLM via an Ollama client. (Incomplete.)
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support planned for the next version.
Ollama with Let's Encrypt Using Docker Compose
PuPu is a lightweight tool that makes it easy to run AI models on your own device. Designed for smooth performance and ease of use, PuPu is perfect for anyone who wants quick access to AI without technical complexity.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
Streamlit Chatbot using Ollama Open Source LLMs
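For readers unfamiliar with this pattern, a Streamlit chat loop backed by a local Ollama model looks roughly like the sketch below. It is a minimal illustration, not the repository's own code; the model name "llama3" is an assumption:

```python
# Minimal sketch of a Streamlit chat loop backed by a local Ollama model.
# Requires `pip install streamlit ollama` and an Ollama server with the
# chosen model pulled; "llama3" is an assumed model name.
import ollama
import streamlit as st

st.title("Ollama chatbot sketch")

# Keep the running conversation in Streamlit's session state.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay history so earlier turns stay visible after each rerun.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    reply = ollama.chat(model="llama3", messages=st.session_state.messages)
    answer = reply["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```

Run it with `streamlit run app.py`; session state is what lets the conversation survive Streamlit's rerun-on-every-interaction model.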
AI model deployment on Synology NAS and macOS 🧠🐳
"A simple and lightweight client-server program for interfacing with local LLMs using ollama, and LLMs in groq using groq api."
An Ollama client for Android.
A web interface for Ollama, providing a user-friendly way to interact with local language models.
A modern, feature-rich web interface built with Next.js and shadcn/ui for interacting with local Ollama large language models.
A lightweight local AI chatbot powered by Ollama and LLMs. Built using Python sockets and multi-threading to handle multiple users at once. Designed for simple, friendly English conversations with emoji-rich replies. 🌟
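A minimal sketch of that socket-plus-threading pattern appears below. The port, model name, and line-based protocol are assumptions for illustration, not the repository's implementation:

```python
# Minimal sketch of a threaded socket server that answers each client with an
# Ollama-generated reply. Port, model name, and the one-line-in/one-line-out
# protocol are assumptions, not details of the repository described above.
import socket
import threading
import ollama

HOST, PORT = "0.0.0.0", 5050  # assumed listening address
MODEL = "llama3"              # assumed model name

def handle_client(conn: socket.socket) -> None:
    """Serve one client: read newline-terminated prompts, reply with the model's answer."""
    with conn, conn.makefile("r", encoding="utf-8") as reader:
        for line in reader:
            prompt = line.strip()
            if not prompt:
                continue
            reply = ollama.chat(model=MODEL,
                                messages=[{"role": "user", "content": prompt}])
            conn.sendall((reply["message"]["content"] + "\n").encode("utf-8"))

def main() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind((HOST, PORT))
        server.listen()
        while True:
            conn, _addr = server.accept()
            # One thread per client so multiple users can chat at once.
            threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```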
A simple but functional chat UI for Ollama. It can easily be dropped into any web app to add a floating chat widget that displays Ollama responses.
Simple SwiftUI app for chatting with Ollama backend