Problem when using ChatOllama #4934
Unanswered
MauroBarotto1989 asked this question in Q&A
I started Flowise 3.0.4 in Docker following the guide in the documentation. When I start a chat using "ChatOllama" with a local model, I don't receive a response. In the Docker Desktop log, I see a retry of the following call every 60 seconds: "/api/v1/internal-prediction/".
If I close and reopen the chat, I can see the replies received from the LLM as well as the continued requests.
Could this be a configuration problem?
Thanks
Replies: 1 comment

- I solved it by completing the Chatflows configuration.
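
For anyone hitting a similar symptom: one common cause when Flowise runs in Docker (not confirmed as the cause in this thread) is the Ollama base URL configured in the chatflow, since `localhost` inside the container refers to the container itself rather than the host where Ollama listens on its default port 11434. Below is a minimal connectivity sketch; the `check-ollama.ts` filename and the `OLLAMA_BASE_URL` value are assumptions for illustration and not part of Flowise.

```typescript
// check-ollama.ts -- hypothetical helper, not part of Flowise.
// Checks whether the machine (or container) it runs on can reach an Ollama
// server, which is the same reachability the ChatOllama node needs.
// Run with: npx tsx check-ollama.ts   (Node 18+ for the built-in fetch)

// Assumption: Ollama listens on its default port 11434 on the Docker host,
// so from inside a container "host.docker.internal" is used instead of "localhost".
const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://host.docker.internal:11434";

async function main(): Promise<void> {
  try {
    // GET /api/tags lists the locally installed models -- a cheap health check.
    const res = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
    if (!res.ok) {
      throw new Error(`Ollama answered with HTTP ${res.status}`);
    }
    const body: { models?: { name: string }[] } = await res.json();
    console.log(`Reached Ollama at ${OLLAMA_BASE_URL}`);
    console.log("Available models:", (body.models ?? []).map((m) => m.name).join(", ") || "(none)");
  } catch (err) {
    console.error(`Could not reach Ollama at ${OLLAMA_BASE_URL}:`, err);
    console.error(
      "If Flowise runs in Docker and Ollama on the host, try http://host.docker.internal:11434 " +
      "(or the host's LAN IP) as the base URL rather than localhost."
    );
  }
}

main();
```

If this check fails from inside the Flowise container but succeeds on the host, the problem is container-to-host networking rather than the model itself; whether that was the issue here is not stated, as the reply above only says the Chatflow configuration was completed.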