ChatOpenAI AgentExecutor not accessing tools when using LLMs hosted on server #30143
Unanswered
Sreeni1204 asked this question in Q&A

Replies: 1 comment
This is a bit confusing: you're using ChatOpenAI, which is designed for OpenAI models, but you're passing Llama-3.3-70B-Instruct. This mismatch is likely the cause of the issue, since LLaMA models don't natively support OpenAI-style tool calling. To fix this:
Also, ensure that you're using |
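To illustrate the point above, here is a minimal sketch of wiring ChatOpenAI to a self-hosted OpenAI-compatible endpoint with a tool bound. The server URL, API key, and tool are placeholders I've made up for illustration; whether `tool_calls` ever gets populated depends entirely on the serving stack translating OpenAI tool schemas for the model (e.g. vLLM started with tool-calling enabled), not on LangChain.

```python
# Hypothetical sketch: the base_url, api_key, and get_weather tool are
# placeholders, not values from the original question.
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the weather for a city."""
    return f"Sunny in {city}"

llm = ChatOpenAI(
    model="Llama-3.3-70B-Instruct",           # must match the served model id
    base_url="https://my-private-server/v1",  # placeholder endpoint
    api_key="not-needed",                     # many self-hosted servers ignore this
)

# bind_tools sends OpenAI-style tool schemas; it only has an effect if the
# serving stack actually supports tool calling for this model.
llm_with_tools = llm.bind_tools([get_weather])
response = llm_with_tools.invoke("What is the weather in Paris?")
print(response.tool_calls)  # stays empty if the server never emits tool_calls
```

If `response.tool_calls` is always empty here, the AgentExecutor has nothing to act on, which matches the behavior described in the question.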
Checked other resources
Commit to Help
Example Code
Description
With the above example code, I am getting the following issue:
With a normal interaction I get a response from the agent, so there is no connectivity issue with the private server. However, the AgentExecutor is not invoking the tools. The same tools are invoked when I switch the model to GPT-4, and I know the model hosted on the private server supports tool interaction. Also, with the debug option enabled, I see the response below, where finish_reason and model_name look wrong.
Could you please support?
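One way to pin down the "weird finish_reason" observation is to inspect the raw chat-completion payload: a backend that actually performs OpenAI-style tool calling finishes with `finish_reason == "tool_calls"` and a populated `tool_calls` list, while a backend that ignores the tool schemas just returns plain text with `finish_reason == "stop"`. A dependency-free sketch of that check (the sample payloads are illustrative, not taken from the question's debug output):

```python
def made_tool_call(completion: dict) -> bool:
    """Return True if the first choice finished with a tool call."""
    choice = completion["choices"][0]
    return (choice.get("finish_reason") == "tool_calls"
            and bool(choice["message"].get("tool_calls")))

# Shape of a response from a backend that honored the tool schema:
tool_reply = {
    "model": "gpt-4",
    "choices": [{
        "finish_reason": "tool_calls",
        "message": {"tool_calls": [
            {"function": {"name": "search", "arguments": "{}"}}
        ]},
    }],
}

# Shape of a response from a backend that ignored it (plain text answer):
plain_reply = {
    "model": "Llama-3.3-70B-Instruct",
    "choices": [{
        "finish_reason": "stop",
        "message": {"content": "I cannot call tools.", "tool_calls": None},
    }],
}

print(made_tool_call(tool_reply))   # True
print(made_tool_call(plain_reply))  # False
```

If the debug output from the private server looks like the second shape even when tools are bound, the problem is on the serving side, not in the AgentExecutor.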
System Info
python 3.10
langchain==0.3.20
langchain-community==0.3.19
langchain-core==0.3.41
langchain-ollama==0.2.3
langchain-openai==0.3.7
langchain-text-splitters==0.3.6