[LangGraph + Ollama] Agent using local model (qwen2.5) returns AIMessage(content='') even when tool responds correctly #30990
matiasdev30 started this conversation in General · 0 comments
I'm using `create_react_agent` from `langgraph.prebuilt` with a local model served via Ollama (`qwen2.5`), and the agent consistently returns an `AIMessage` with an empty `content` field, even though the tool returns a valid string.

Code:
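The original snippet wasn't preserved on this page, so the following is a minimal sketch of the setup described above; the `search` tool, the prompt text, and the `ChatOllama` wiring are assumptions based on the usual LangGraph quickstart shape.

```python
# Minimal sketch of the setup described in the post. The original code block
# was not preserved, so tool name, prompt, and wiring are assumptions.
from langchain_core.tools import tool
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent


@tool
def search(query: str) -> str:
    """Return the current weather for a location."""
    return "It's 60 degrees and foggy."


model = ChatOllama(model="qwen2.5")          # local model served via Ollama
agent = create_react_agent(model, tools=[search])

result = agent.invoke(
    {"messages": [("user", "What's the weather in San Francisco?")]}
)
print(result["messages"][-1])                # final AIMessage from the agent
```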
Output:
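The exact trace also wasn't preserved; based on the description, the tail of `result["messages"]` looks roughly like this (field layout is illustrative):

```
ToolMessage(content="It's 60 degrees and foggy.", name='search', tool_call_id='...')
AIMessage(content='')
```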
As shown above, the agent responds with an empty string, even though the `search()` tool clearly returns `"It's 60 degrees and foggy."`.

Has anyone seen this behavior? Could it be an issue with `qwen2.5`, `langgraph.prebuilt`, the Ollama config, or maybe a mismatch somewhere between them? Any insight appreciated.