Example Code

from langgraph.prebuilt import create_react_agent
from langchain_mistralai import ChatMistralAI
from langchain_core.tools import tool

api_key = "xxxx"
llm = ChatMistralAI(temperature=0, api_key=api_key)

@tool
def get_weather(city: str = "niaho") -> str:
    """Get the weather of a city."""
    return f"It's sunny in {city}"

tools = [get_weather]
agent = create_react_agent(model=llm, tools=tools)

input_data = {"messages": [("human", "What's the weather like in New York?")]}
result = agent.invoke(input_data)
for message in result["messages"]:
    print(message.content)

Description

At first I also ran into this problem when following the tutorial. I tested it with the small example above, but it still didn't work. The free API key I'm using is limited to the smallest model. Could that be the reason?

---output---
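One way to check whether the model itself is the issue is to bind the tool directly and inspect the model's tool calls. This is a minimal sketch that reuses api_key and get_weather from the snippet above; "mistral-large-latest" is only an assumed example of a Mistral model that supports function calling, so substitute whichever model your key actually allows.

from langchain_mistralai import ChatMistralAI

# Reuses api_key and get_weather from the example above.
# "mistral-large-latest" is an assumed example of a tool-calling-capable model.
probe_llm = ChatMistralAI(model="mistral-large-latest", temperature=0, api_key=api_key)
probe = probe_llm.bind_tools([get_weather]).invoke("What's the weather like in New York?")

# If the model supports function calling, this prints a non-empty list of
# tool calls; an empty list suggests it is answering in plain text instead.
print(probe.tool_calls)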
Replies: 1 comment
Oh, I solved this problem. It is indeed due to the Mistral model: the small model does not support tool (function) calling. I switched to ChatTongyi, and it uses the tool correctly.
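For reference, a minimal sketch of the swap, assuming ChatTongyi is imported from langchain_community and the DashScope key is available via the DASHSCOPE_API_KEY environment variable; "qwen-max" is only an assumed example of a Tongyi model that supports tool calling.

from langgraph.prebuilt import create_react_agent
from langchain_community.chat_models import ChatTongyi
from langchain_core.tools import tool

@tool
def get_weather(city: str = "niaho") -> str:
    """Get the weather of a city."""
    return f"It's sunny in {city}"

# Assumes DASHSCOPE_API_KEY is set in the environment; "qwen-max" is an
# assumed example of a model that supports tool calling.
llm = ChatTongyi(model="qwen-max")

agent = create_react_agent(model=llm, tools=[get_weather])
result = agent.invoke(
    {"messages": [("human", "What's the weather like in New York?")]}
)
for message in result["messages"]:
    print(message.content)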