[LangGraph + Ollama] Agent using local model (qwen2.5) returns AIMessage(content='') even when tool responds correctly #30988
Replies: 1 comment
The issue you're encountering, where the `AIMessage` comes back with an empty `content` field, usually means the model produced a tool call but never generated a final text response. In your case, it seems that the tool's response is not being directly set as the `content` of the final `AIMessage`; instead, the model is expected to read the tool's output and compose its own reply. Additionally, ensure that the integration between the Ollama-served model and LangGraph is configured correctly, in particular that the model actually supports tool calling. If the issue persists, consider reviewing the implementation details of the `create_react_agent` prebuilt and how it routes tool results back to the model.
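To illustrate the flow described above (the tool result is fed back to the model, which then writes the final `content` itself), here is a dependency-free sketch of a ReAct-style loop with a fake model. All names and the dict message shapes are illustrative assumptions, not LangGraph internals:

```python
def search(query: str) -> str:
    # The poster's tool: always returns a fixed string.
    return "It's 60 degrees and foggy."

def fake_model(messages):
    """Stand-in for the chat model. First turn: emit a tool call with empty
    content (the behavior reported here). Second turn: once a tool result is
    in the transcript, a well-behaved model summarizes it as final content."""
    if not any(m["type"] == "tool" for m in messages):
        return {"type": "ai", "content": "",
                "tool_call": {"name": "search", "args": {"query": "weather sf"}}}
    tool_result = [m for m in messages if m["type"] == "tool"][-1]["content"]
    return {"type": "ai", "content": f"The weather: {tool_result}"}

def react_loop(question, tools, max_steps=5):
    """Minimal ReAct loop: call the model, execute any requested tool,
    append the result, and repeat until the model stops calling tools."""
    messages = [{"type": "human", "content": question}]
    for _ in range(max_steps):
        ai = fake_model(messages)
        messages.append(ai)
        call = ai.get("tool_call")
        if call is None:
            return ai["content"]  # final answer: model stopped calling tools
        result = tools[call["name"]](**call["args"])
        messages.append({"type": "tool", "content": result})
    return ""

print(react_loop("what is the weather in sf?", {"search": search}))
```

If the real model behaves like the second branch of `fake_model` but still returns `""`, the problem is likely in the model's tool-calling support rather than in the loop itself.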
I'm using `create_react_agent` from `langgraph.prebuilt` with a local model served via Ollama (`qwen2.5`), and the agent consistently returns an `AIMessage` with an empty `content` field, even though the tool returns a valid string.

Code

Output

As shown above, the agent responds with an empty string, even though the `search()` tool clearly returns `"It's 60 degrees and foggy."`.

Has anyone seen this behavior? Could it be an issue with `qwen2.5`, `langgraph.prebuilt`, the Ollama config, or maybe a mismatch somewhere between them? Any insight appreciated.
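The original code and output blocks did not survive extraction, and running the real agent requires a local Ollama server, so here is a dependency-free sketch of how to inspect the returned message trace and fall back to the last tool result when the final `AIMessage.content` is empty. The dict shapes are stand-ins for LangChain's `HumanMessage`/`AIMessage`/`ToolMessage` objects, an assumption for illustration only:

```python
# Stand-in message trace mimicking what agent.invoke(...)["messages"] can look
# like when the model calls the tool but emits no final text.
trace = [
    {"type": "human", "content": "what is the weather in sf?"},
    {"type": "ai", "content": "", "tool_calls": [{"name": "search"}]},
    {"type": "tool", "content": "It's 60 degrees and foggy."},
    {"type": "ai", "content": ""},  # the empty final answer reported above
]

def final_answer(messages):
    """Prefer the last non-empty AI message; otherwise fall back to the
    most recent tool result so callers still get usable text."""
    for msg in reversed(messages):
        if msg["type"] == "ai" and msg["content"]:
            return msg["content"]
    for msg in reversed(messages):
        if msg["type"] == "tool":
            return msg["content"]
    return ""

print(final_answer(trace))  # falls back to the tool's string
```

Printing the whole trace this way also makes it obvious whether the model emitted a tool call at all, which helps separate a `qwen2.5` tool-calling problem from a `langgraph.prebuilt` wiring problem.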