# llm-frontend
Here are 3 public repositories matching this topic...
React-based LLM frontend implementing CRAG & ToolCalls
Updated Apr 11, 2025 - TypeScript
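As a rough, hypothetical sketch of the tool-call pattern a frontend like this tends to implement (not code from the listed repository; the endpoint URL, model name, and `localTools` map below are assumptions), a TypeScript client can loop until the model stops requesting tools:

```typescript
// Hypothetical tool-call loop against an OpenAI-compatible
// /v1/chat/completions endpoint. The URL, model name, and the
// `localTools` map are illustrative assumptions, not project details.
type ToolCall = { id: string; function: { name: string; arguments: string } };
type Message = {
  role: string;
  content: string | null;
  tool_calls?: ToolCall[];
  tool_call_id?: string;
};

const localTools: Record<string, (args: any) => Promise<string>> = {
  // e.g. a retrieval tool the model can invoke during a CRAG-style flow
  search_docs: async ({ query }) => JSON.stringify({ query, results: [] }),
};

async function chat(messages: Message[]): Promise<Message> {
  while (true) {
    const res = await fetch("http://localhost:8080/v1/chat/completions", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "local-model", messages }),
    });
    const reply: Message = (await res.json()).choices[0].message;
    messages.push(reply);
    if (!reply.tool_calls?.length) return reply; // model answered directly
    // Run each requested tool locally and feed its result back to the model.
    for (const call of reply.tool_calls) {
      const tool = localTools[call.function.name];
      const result = tool
        ? await tool(JSON.parse(call.function.arguments))
        : "unknown tool";
      messages.push({ role: "tool", tool_call_id: call.id, content: result });
    }
  }
}
```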
Lightweight web UI for llama.cpp with dynamic model switching, chat history & markdown support. No GPU required. Perfect for local AI development.
javascript sqlite self-hosted developer-tools web-interface python-flask chat-ui ai-assistant conversation-history model-switching cpu-inference performance-optimized llama-cpp local-ai llm-frontend open-source-ai offline-ai private-ai markdown-chat
Updated Jun 23, 2025 - Shell
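For context on how a browser UI like this can talk to a locally running llama.cpp server, here is a minimal sketch against the OpenAI-compatible chat endpoint exposed by `llama-server`; the port, model name, and response handling are illustrative assumptions, not details of the listed project:

```typescript
// Minimal sketch: send one chat turn to a local llama.cpp server
// (llama-server) via its OpenAI-compatible endpoint and return the
// reply text. Port and model name are assumptions for illustration.
async function askLocalLlama(
  history: { role: string; content: string }[]
): Promise<string> {
  const res = await fetch("http://localhost:8080/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "default", messages: history, stream: false }),
  });
  if (!res.ok) throw new Error(`llama-server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content; // markdown text, ready for a renderer
}

// Usage: append the returned assistant reply to the conversation history.
// askLocalLlama([{ role: "user", content: "Summarize llama.cpp in one line." }])
//   .then(console.log);
```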