Improvement in GUI settings for OpenAI compatible (Ollama) LLM API #27890
osering started this conversation in Feature Ideas / Enhancements
Replies: 0 comments
Proposal: add "api_url" and "model (name)" fields under the "Ollama" subtitle in Assistant Configuration. This would let users configure in the GUI a host other than 127.0.0.1 (often 0.0.0.0 is needed) and a port other than 11434, making it possible to use not only Ollama but also TextGen WebUI and other local backends, which use different ports and paths but expose a more or less common OpenAI-compatible API.
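A minimal sketch of why a single configurable "api_url" is enough to cover these backends: they differ only in the base URL, while the request path and JSON body follow the common OpenAI-compatible shape. The config keys and the `build_chat_request` helper below are hypothetical illustrations of the proposed GUI fields, not actual Zed code.

```python
import json

# Hypothetical defaults mirroring the proposed GUI fields;
# the values match Ollama's usual host and port.
DEFAULT_CONFIG = {
    "api_url": "http://127.0.0.1:11434",
    "model": "llama3",  # placeholder model name
}

def build_chat_request(config, prompt):
    """Build the URL and JSON body for an OpenAI-compatible
    /v1/chat/completions call against any configured backend."""
    url = config["api_url"].rstrip("/") + "/v1/chat/completions"
    body = json.dumps({
        "model": config["model"],
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# A TextGen WebUI (or other) backend on a different host/port
# needs only a different api_url, nothing else changes:
url, body = build_chat_request(
    {"api_url": "http://0.0.0.0:5000", "model": "my-local-model"},
    "hello",
)
```

With the proposed fields exposed in the GUI, switching backends would amount to editing `api_url` and `model` rather than being locked to the hard-coded Ollama host and port.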