[LangGraph + Ollama] Agent using local model (qwen2.5) returns AIMessage(content='') even when tool responds correctly #30991
Hello @matiasdev30! I'm Dosu, a bot here to help you solve bugs, answer questions, and assist with becoming a contributor. I'm ready to support you while you wait for a human maintainer.

Here are a few things you might want to check or try:

If these steps don't resolve the issue, consider testing with a different model or tool to isolate whether the problem is specific to the model or the tool.

To continue talking to Dosu, mention @dosu.
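One practical way to follow that isolation advice is to print the full message history the agent returns rather than just the last message: in a ReAct loop, an `AIMessage` with empty `content` but populated `tool_calls` is normal mid-run, and the bug only matters if the *final* `AIMessage` is empty. A small duck-typed helper (a sketch, not part of any library API; it works on LangChain message objects or any stand-ins exposing `content`/`tool_calls`) might look like:

```python
def dump_messages(messages):
    """Print each message's type, content, and tool-call count for debugging.

    Uses getattr() so it works on LangChain message objects or plain stand-ins.
    Returns the formatted lines so callers can inspect them programmatically.
    """
    lines = []
    for i, m in enumerate(messages):
        kind = type(m).__name__
        content = getattr(m, "content", "")
        tool_calls = getattr(m, "tool_calls", None) or []
        line = f"{i}: {kind} content={content!r} tool_calls={len(tool_calls)}"
        lines.append(line)
        print(line)
    return lines
```

Calling `dump_messages(result["messages"])` on the agent's output makes it obvious whether the empty `content` belongs to an intermediate tool-calling turn (expected) or to the final answer (the reported bug).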
I'm using `create_react_agent` from `langgraph.prebuilt` with a local model served via Ollama (`qwen2.5`), and the agent consistently returns an `AIMessage` with an empty `content` field, even though the tool returns a valid string.

Code

Output

As shown above, the agent responds with an empty string, even though the `search()` tool clearly returns `"It's 60 degrees and foggy."`.

Has anyone seen this behavior? Could it be an issue with `qwen2.5`, `langgraph.prebuilt`, the Ollama config, or maybe a mismatch somewhere between them?

Any insight appreciated.
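Since the original Code and Output blocks did not survive the page export, here is a minimal sketch of the setup as described, for anyone trying to reproduce. The import path (`langchain_ollama`) and the stub `search` tool body are assumptions based on the text of the question, not the poster's actual code.

```python
# Minimal reproduction sketch (assumptions: a local Ollama server is running
# and `ollama pull qwen2.5` has been done; the tool is a plain function).

def search(query: str) -> str:
    """Stub weather tool: always returns the same string, as in the question."""
    return "It's 60 degrees and foggy."

def main() -> None:
    # Imports are kept inside main() so the stub tool can be used without
    # langgraph/langchain-ollama installed.
    from langchain_ollama import ChatOllama
    from langgraph.prebuilt import create_react_agent

    model = ChatOllama(model="qwen2.5")
    agent = create_react_agent(model, [search])

    result = agent.invoke({"messages": [("user", "What's the weather in SF?")]})
    # Print the whole transcript: tool-calling AIMessages legitimately have
    # empty content, but the final AIMessage should contain the answer text.
    for message in result["messages"]:
        print(type(message).__name__, repr(message.content))

# Run main() manually once the Ollama server is up.
```

Dumping every message in the transcript (rather than only `result["messages"][-1].content`) helps distinguish a model that never produces a final answer from one whose final answer is being dropped somewhere between Ollama and LangGraph.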