Replies: 2 comments
-
me too
-
You need to replace Ollama with OpenAI in the code. For this:
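A minimal sketch of what that swap could look like: instead of calling an Ollama endpoint, the pipeline sends an OpenAI-compatible chat-completions request to the connection URL from the question. The base URL comes from the question above; the model name and API key here are placeholders, not values confirmed by the thread.

```python
import json
import urllib.request

# Base URL of the open-webui OpenAI API connection (from the question).
OPENAI_API_BASE = "http://model:8000/v1"
OPENAI_API_KEY = "sk-placeholder"  # placeholder; use your connection's key
MODEL_NAME = "default"             # placeholder; use your served model's name

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OPENAI_API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {OPENAI_API_KEY}",
        },
        method="POST",
    )

# To actually call the endpoint:
#   with urllib.request.urlopen(build_chat_request("Translate this to SQL: ...")) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

The same idea applies if the pipeline uses the `openai` client library: point its `base_url` at the open-webui connection URL instead of the Ollama host.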
-
I have an existing OpenAI API connection "http://model:8000/v1", which is the default chat model in my open-webui.
I would like to use this same "http://model:8000/v1" LLM inside the pipeline, e.g. to do nl2sql. Maybe I missed it, but I can only see Ollama and other "externally defined" LLMs.
How can I use the OpenAI API URL defined in open-webui inside my pipeline?