Replies: 2 comments 15 replies
-
Hey there, @suribe06! I'm here to help you out with any coding conundrums you may have. I can assist with bug squashing, answer your burning questions, and even guide you on the path to becoming a contributor. Let's tackle this together!

To prevent the LLM from overwriting the parameters passed to a LangChain tool in a custom agent, you can annotate those parameters with `InjectedToolArg`. Arguments annotated this way are hidden from the model and must be supplied by your code at runtime. Here is an example of how to do this:

```python
from typing import List

from langchain_core.tools import InjectedToolArg, tool
from typing_extensions import Annotated

user_to_pets = {}


@tool(parse_docstring=True)
def update_favorite_pets(
    pets: List[str], user_id: Annotated[str, InjectedToolArg]
) -> None:
    """Add the list of favorite pets.

    Args:
        pets: List of favorite pets to set.
        user_id: User's ID.
    """
    user_to_pets[user_id] = pets


@tool(parse_docstring=True)
def delete_favorite_pets(user_id: Annotated[str, InjectedToolArg]) -> None:
    """Delete the list of favorite pets.

    Args:
        user_id: User's ID.
    """
    if user_id in user_to_pets:
        del user_to_pets[user_id]


@tool(parse_docstring=True)
def list_favorite_pets(user_id: Annotated[str, InjectedToolArg]) -> List[str]:
    """List favorite pets if any.

    Args:
        user_id: User's ID.
    """
    return user_to_pets.get(user_id, [])
```

When you invoke a tool directly, you pass the injected `user_id` in yourself:

```python
user_id = "123"
update_favorite_pets.invoke({"pets": ["lizard", "dog"], "user_id": user_id})
print(user_to_pets)
print(list_favorite_pets.invoke({"user_id": user_id}))
```
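As a quick sanity check, you can compare the tool's full input schema with the schema the model actually sees for tool calling; the injected `user_id` should only appear in the former. The exact schema helpers may vary with your `langchain-core` version, but roughly:

```python
# Full input schema: includes user_id, since the tool itself still needs it.
print(update_favorite_pets.get_input_schema().schema())

# Schema sent to the model for tool calling: user_id should be absent.
print(update_favorite_pets.tool_call_schema.schema())
```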
When the model generates a tool call, no `user_id` argument will be included, because the model never sees that argument. You inject it into the tool calls yourself at runtime:

```python
from copy import deepcopy

from langchain_core.runnables import chain


@chain
def inject_user_id(ai_msg):
    # ai_msg is the AIMessage returned by your tool-calling model
    # (e.g. ai_msg = llm_with_tools.invoke("my favorite animals are cats and parrots")).
    tool_calls = []
    for tool_call in ai_msg.tool_calls:
        tool_call_copy = deepcopy(tool_call)
        tool_call_copy["args"]["user_id"] = user_id
        tool_calls.append(tool_call_copy)
    return tool_calls


inject_user_id.invoke(ai_msg)
```

Finally, chain together your model, injection code, and the actual tools to create a tool-executing chain:
```python
tool_map = {tool.name: tool for tool in tools}


@chain
def tool_router(tool_call):
    return tool_map[tool_call["name"]]


chain = llm_with_tools | inject_user_id | tool_router.map()
chain.invoke("my favorite animals are cats and parrots")
```

This approach ensures that the `user_id` the tools receive is always the one you pass in your code, never a value generated by the LLM.
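The snippets above assume the tools have already been bound to a tool-calling chat model and that `ai_msg` is that model's response. A minimal sketch of that setup, using `langchain-openai` purely for illustration (swap in whichever chat model you actually run):

```python
from langchain_openai import ChatOpenAI

# Any chat model that supports tool calling works; ChatOpenAI is only an example choice.
llm = ChatOpenAI(model="gpt-4o-mini")

tools = [update_favorite_pets, delete_favorite_pets, list_favorite_pets]
llm_with_tools = llm.bind_tools(tools)

# The model's tool calls will not contain user_id; inject_user_id adds it afterwards.
ai_msg = llm_with_tools.invoke("my favorite animals are cats and parrots")
```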
-
@dosu
But the …
-
Checked other resources
Commit to Help
Example Code
Description
I'm working on a custom LangChain agent that uses a tool to make API calls based on parameters passed in from my code. However, the tool is not using the parameters I provide; the LLM seems to be generating its own values instead. Here's the output:

As you can see, `api_name` has changed to `movieDatabase`, even though I passed `trakt.tv`. I want the tool to use the parameters I define in the code, not those generated by the LLM.

Is there a way to ensure that the parameters I pass are respected?
Any advice or suggestions would be greatly appreciated.
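For concreteness, the shape of what I'm doing is roughly the following (simplified, with illustrative names rather than my actual code):

```python
from langchain_core.tools import tool


@tool
def call_api(query: str, api_name: str) -> str:
    """Call an external API with the given query."""
    # Illustrative stand-in for my real tool; the names here are made up.
    return f"calling {api_name} with query {query!r}"


# In my code I pass api_name="trakt.tv", but the tool call the agent produces
# arrives with a model-generated value such as api_name="movieDatabase".
```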
System Info
langchain
langchain-community
langchain-core
langchain-experimental
langchain-huggingface
langchain-openai
langchain-text-splitters
langcodes
langsmith