Ollama as a separate container #1761
Unanswered
mrepetto94 asked this question in Q&A

Has anyone tried to run Ollama in a separate container with docker-compose? Changing the `api_base` to `http://ollama:11434` does not return anything aside from a warning.
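For context, here is a minimal sketch of the setup being described. The service names, the `private-gpt` build step, the ports, and the `PGPT_PROFILES` environment variable are assumptions for illustration, not a confirmed working configuration:

```yaml
# docker-compose.yml (sketch): run Ollama as its own service so private-gpt
# can reach it at http://ollama:11434 over the compose network.
services:
  ollama:
    image: ollama/ollama:latest      # official Ollama image
    volumes:
      - ollama-models:/root/.ollama  # persist pulled models across restarts
    ports:
      - "11434:11434"                # optional: also expose the API on the host

  private-gpt:
    build: .                         # assumes a Dockerfile for private-gpt in the repo root
    environment:
      - PGPT_PROFILES=ollama         # assumed profile that loads settings-ollama.yaml
    depends_on:
      - ollama
    ports:
      - "8001:8001"

volumes:
  ollama-models:
```

With that layout, the `api_base` override mentioned above points at the compose service name instead of localhost (field shown in isolation; the surrounding settings file is assumed):

```yaml
# settings-ollama.yaml (sketch)
ollama:
  api_base: http://ollama:11434  # "ollama" resolves to the service defined above
```

Note that inside the compose network, `localhost` refers to the private-gpt container itself, which is why a default of `http://localhost:11434` would fail to reach a separately containerized Ollama.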
Replies: 2 comments
-
Hello, could you find any solution? I have the same problem.
-
Look at pull request #1812