Ollama Support? #7
Closed · mrjones2014 started this conversation in Ideas
Replies: 3 comments · 7 replies
-
Even better would be to use the Open-WebUI API directly, as that project adds features like chat history.
0 replies
-
Great timing. I only came across Ollama a couple of days ago. Integration looks pretty straightforward given the API structure, but yep, it will need adapters. The code base is very tightly coupled to OpenAI.
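The adapter idea above could look something like the following sketch. This is purely illustrative (the names `adapt_request` and the field mapping are assumptions, not the plugin's actual code): it translates an OpenAI-style chat-completion request body into the shape Ollama's `/api/chat` endpoint accepts.

```python
def adapt_request(openai_payload: dict) -> dict:
    """Map an OpenAI chat.completions request body to an Ollama /api/chat body.

    Hypothetical sketch: field names on the Ollama side follow its public
    REST API, but this is not the plugin's real adapter.
    """
    return {
        "model": openai_payload["model"],
        # Ollama accepts the same role/content message list as OpenAI.
        "messages": openai_payload["messages"],
        # Ollama streams by default; make the choice explicit.
        "stream": openai_payload.get("stream", False),
        # Sampling parameters move under an "options" key in Ollama.
        "options": {
            key: value
            for key, value in (
                ("temperature", openai_payload.get("temperature")),
                ("num_predict", openai_payload.get("max_tokens")),
            )
            if value is not None
        },
    }
```

With an adapter layer like this, the rest of the plugin can keep speaking OpenAI's request shape and only the transport layer changes per backend.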
2 replies
-
Hey @olimorris, I'm trying this out and I'm getting this error:
5 replies
-
I've recently been using Ollama, which provides a unified REST API to any open-source LLM, together with Open-WebUI, which gives a completely offline, local ChatGPT-like experience.
Any chance we could add an API adapter or similar to make this plugin work with a local Ollama API?
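For anyone unfamiliar with the API being discussed, here is a minimal sketch of calling Ollama's `/api/generate` endpoint from Python. The default port (11434) and the `model`/`prompt`/`stream` fields come from Ollama's documented REST API; the helper names here are just for illustration.

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for a single JSON object
    # instead of a stream of newline-delimited chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to a local Ollama server and return the completion text."""
    request = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # The non-streaming reply carries the completion in "response".
        return json.load(response)["response"]
```

Because the interface is plain JSON over HTTP, an editor plugin only needs a small HTTP client and a per-backend payload builder to support it.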