NeaLLM is a local LLM chat assistant supporting Ollama and LM Studio. Enjoy a fast, modern, and private chat UI — your AI assistant runs fully on your own device, with no cloud or data sharing.
- ⚡ Lightning Fast: Instant responses from models running on your own hardware, with no network round trips.
- 🔒 100% Private: All data stays on your device; nothing is sent to the cloud.
- 🤖 Smart Assistant: Seamlessly switch between Ollama and LM Studio LLMs.
1. Clone the repository:

   ```shell
   git clone https://github.com/NeaDigitra/NeaLLM.git
   cd NeaLLM
   ```

2. Install dependencies:

   ```shell
   npm install   # or: yarn
   ```

3. Run the app:

   ```shell
   npm run dev   # or: yarn dev
   ```

4. Open http://localhost:5173 in your browser.
- Ollama: Make sure the Ollama server is running locally (it serves on port 11434 by default).
- LM Studio: Alternatively, start LM Studio's local server and point NeaLLM at its API (port 1234 by default).
- Configuration can be managed from the app settings (gear icon).
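Before launching the app, you can quickly verify that at least one backend is reachable. The sketch below assumes the tools' default local ports (Ollama on 11434, LM Studio's server on 1234); adjust the URLs if you have changed them in your setup.

```shell
# Probe each backend's local HTTP endpoint. The ports are the tools'
# defaults (Ollama: 11434, LM Studio: 1234) and may differ on your setup.
for backend in "Ollama|http://localhost:11434/api/tags" \
               "LM Studio|http://localhost:1234/v1/models"; do
  name=${backend%%|*}
  url=${backend#*|}
  if curl -sf --max-time 2 "$url" > /dev/null 2>&1; then
    echo "$name: reachable at $url"
  else
    echo "$name: NOT reachable at $url"
  fi
done
```

If a backend shows as not reachable, start it (for example `ollama serve`, or LM Studio's "Start Server" button) and re-run the check.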
MIT © NeaDigitra