Replies: 1 comment 11 replies
- OLLAMA_API_BASE_URL is the right env var. Are you using patch() to create your client? If so, it should pick up the env var here.
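For reference, a minimal sketch of that pattern, assuming the astra-assistants Python package and an Ollama instance reachable at http://ollama:11434 (the service name mentioned in the question below); the URL and the model name are placeholders for your own setup:

```python
import os

# Point astra-assistants at the Ollama host before the client is created.
# "http://ollama:11434" assumes Ollama runs as a service named "ollama"
# (e.g. in docker compose); adjust the URL, or set the variable in the
# container environment instead of in code.
os.environ["OLLAMA_API_BASE_URL"] = "http://ollama:11434"

from openai import OpenAI
from astra_assistants import patch

# patch() wraps the standard OpenAI client so requests are routed through
# the Astra Assistants API; it should pick up OLLAMA_API_BASE_URL here.
client = patch(OpenAI())

# "ollama/llama3" is a placeholder name for an Ollama-served model.
assistant = client.beta.assistants.create(
    name="ollama-assistant",
    instructions="You are a helpful assistant.",
    model="ollama/llama3",
)
print(assistant.id)
```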
- I might be misunderstanding something, but how do I make cpu/astra-assistant-api connect to ollama:11434 instead of localhost:11434? At a glance, astra-assistant-api looks like the drop-in replacement for the OpenAI Assistants API that I'm looking for. I want to use a model from Ollama, but I'm stuck at this step.