Ollama LLM not actually calling the tool, but correctly identifying the need for tool #31530
Closed
RETR0-OS
announced in
Ask Dosu (Archived)
Replies: 1 comment 2 replies
-
This result is correct. The model's output is only a *request* to call the tool; nothing in the chat model itself executes it. You need to use a ToolNode to execute the tool call. See https://github.com/langchain-ai/langchain-academy/blob/main/module-1/agent.ipynb
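To make the distinction concrete, here is a minimal sketch of the dispatch step a ToolNode performs for you. It uses plain Python stand-ins rather than the real LangChain classes: `get_weather` and the `tool_call` dict are hypothetical, but the dict mirrors the shape of an entry in `AIMessage.tool_calls` (a `name` and an `args` mapping):

```python
# Hypothetical tool the model might request.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

# Registry mapping tool names to callables, as a ToolNode keeps internally.
TOOLS = {"get_weather": get_weather}

# What the LLM actually returns: a structured *request* to call a tool,
# shaped like one entry of LangChain's AIMessage.tool_calls.
tool_call = {"name": "get_weather", "args": {"city": "Berlin"}}

# The execution step the model never does itself: look up the tool by
# name and invoke it with the model-supplied arguments.
result = TOOLS[tool_call["name"]](**tool_call["args"])
print(result)  # Sunny in Berlin
```

In a LangGraph agent, this lookup-and-invoke step is exactly what the prebuilt ToolNode does with the `tool_calls` on the last AI message, feeding the result back to the model as a tool message.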
-
I am trying to test the tool-calling functionality of LangChain with the Ollama chat model. I am using llama3-groq-tool-use:latest as the LLM. Here is my code:
I am getting the following output:
The LLM correctly identifies that it needs to call a tool but does not actually call it. How do I fix this issue?