Hey @kurup-007! I'm here to help you with your code and answer any questions you have while waiting for a human maintainer. Let's work together to solve this issue!

To ensure that the model actually calls the tool instead of answering from its own knowledge, you can force tool use with the tool_choice parameter when binding the tools:

from typing import List

from langchain_ollama import ChatOllama


def validate_user(user_id: int, addresses: List) -> bool:
    """Validate user using historical addresses.

    Args:
        user_id: (int) the user ID.
        addresses: Previous addresses.
    """
    return True


llm = ChatOllama(
    model="llama3-groq-tool-use",
    temperature=0,
).bind_tools(
    tools=[validate_user],
    tool_choice={"type": "function", "function": {"name": "validate_user"}},
)

result = llm.invoke(
    "Could you validate user 123? They previously lived at "
    "123 Fake St in Boston MA and 234 Pretend Boulevard in "
    "Houston TX."
)
print(result.tool_calls)

In this example, the tool_choice argument forces the model to call validate_user, so result.tool_calls should contain the tool call instead of coming back empty.
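As a follow-up, note that the model only emits the tool call; you still execute the function yourself. A minimal sketch of that step, assuming the llm, validate_user, and result objects from the snippet above:

# Execute the tool call returned by the model.
# result.tool_calls is a list of dicts with "name", "args", and "id" keys.
for tool_call in result.tool_calls:
    if tool_call["name"] == "validate_user":
        # tool_call["args"] is a dict parsed from the model's JSON output,
        # e.g. {"user_id": 123, "addresses": [...]}
        is_valid = validate_user(**tool_call["args"])
        print(f"validate_user returned: {is_valid}")

From there you can wrap the return value in a ToolMessage and send it back to the model if you need a final natural-language answer.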
Description
I am trying to migrate my code from OllamaFunctions to ChatOllama, but when I tried a sample case with a function, the model doesn't call the tool. It responds from its own knowledge instead, and the result.tool_calls array is empty.
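For reference, a minimal way to reproduce and diagnose this symptom (a sketch assuming the llama3-groq-tool-use model, not code from this thread): if the model emits a tool call, result.tool_calls is populated; otherwise the answer lands in result.content.

from typing import List

from langchain_ollama import ChatOllama


def validate_user(user_id: int, addresses: List) -> bool:
    """Validate user using historical addresses."""
    return True


llm = ChatOllama(model="llama3-groq-tool-use", temperature=0)
llm_with_tools = llm.bind_tools([validate_user])

result = llm_with_tools.invoke(
    "Could you validate user 123? They previously lived at 123 Fake St in Boston MA."
)
if result.tool_calls:
    print("Tool call emitted:", result.tool_calls)
else:
    # The symptom described above: the model answers from its own
    # knowledge instead of calling the tool.
    print("No tool call; plain response:", result.content)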