Replies: 5 comments 12 replies
-
The warning you're encountering suggests there's an issue with how runs are linked, especially in the context of managing chat message histories. Here's a focused approach to address this:
Ensure that the configurations and identifiers are consistently used and correctly passed to each run to avoid such warnings. If the issue persists, please share more details about how you're invoking the runnable and managing session history.
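The linking failure can be pictured with a toy model (plain Python, an illustration only, not LangChain's actual tracer): when a child run arrives carrying a parent run ID the tracer never saw, it logs the warning and promotes the child to a root run.

```python
# Toy model of run linking (hypothetical sketch, not LangChain internals).
runs = {}

def register_run(run_id, parent_run_id=None):
    """Record a run; if its parent was never registered, treat it as a root."""
    if parent_run_id is not None and parent_run_id not in runs:
        print(f"Parent run {parent_run_id} not found for run {run_id}. "
              "Treating as a root run.")
        parent_run_id = None
    runs[run_id] = parent_run_id
    return parent_run_id

# A child whose parent ID was never registered falls back to a root run:
register_run("child-1", "missing-parent")   # triggers the warning
# Registering the parent first keeps the link intact:
register_run("parent-1")
register_run("child-2", "parent-1")
```

This is why the advice above stresses consistency: if the identifiers seen at invoke time don't match the ones the tracer registered, every child run degrades to an unlinked root run.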
-
Has anyone figured out what is causing this warning or how to resolve it? Adding the type hint didn't work, and I'm not sure how that could fix it anyway.
-
Hello, I'm also seeing the same error messages. In my case I'm not using chains; I create an AzureChatOpenAI client and call its `invoke` method myself. Everything worked just fine until I added a second AzureChatOpenAI client (the first one has tool bindings, but for some calls I want to run without tools), which is when I started getting these errors. I tried generating my own session_id and setting it in both clients, in the app run, and in the invoke calls, but this doesn't seem to help. How do I make this work with two different clients making calls?

Here is my client init code:

```python
run_config = RunnableConfig(
    run_name=run_id,
    configurable={
        "session_id": run_id
    }
)

model = AzureChatOpenAI(
    azure_deployment=os.getenv("AZURE_MODEL"),
    api_version="2024-03-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    max_retries=15,
    temperature=0
).with_config({
    "callbacks": [ConsoleCallbackHandler()],
    "configurable": {
        "session_id": run_id
    }
})

model_with_tools = AzureChatOpenAI(
    azure_deployment=os.getenv("AZURE_MODEL"),
    api_version="2024-03-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    max_retries=15,
    temperature=0
).with_config({
    "callbacks": [ConsoleCallbackHandler()],
    "configurable": {
        "session_id": run_id
    }
}).bind_tools(log_tools)
```

And an invoke example:

```python
response = model_with_tools.invoke(messages, run_config)
response = model.invoke(messages, run_config)
```

The app run:

```python
for event in app.stream(
    {
        "messages": [],
        "history": []
    },
    config={
        "configurable": {
            "thread_id": 42,
            "session_id": run_id
        },
        "recursion_limit": 1000,
        "run_name": run_id
    }
):
    pass
```

Thanks!
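One invariant worth checking in a two-client setup like the one above (sketched here with a plain-Python stand-in, since I can't verify LangChain's exact config-merge semantics from the thread; `FakeClient` is a hypothetical name, not a LangChain class): build the config once and hand the same object to both clients and to every invoke call, so all runs resolve to the same session_id.

```python
# Plain-Python stand-in for two clients sharing one config (hypothetical).
shared_config = {"configurable": {"session_id": "run-123"}}

class FakeClient:
    def __init__(self, bound_config, tools=None):
        self.bound_config = bound_config
        self.tools = tools or []

    def invoke(self, messages, config=None):
        # Simplification: fall back to the bound config when no per-call
        # config is given (LangChain actually merges bound and per-call
        # configs rather than picking one).
        cfg = config if config is not None else self.bound_config
        return cfg["configurable"]["session_id"]

model = FakeClient(shared_config)
model_with_tools = FakeClient(shared_config, tools=["log_tool"])
```

With one shared config object, both clients report the same session_id whether or not a per-call config is passed; the warning tends to appear precisely when that invariant is broken between calls.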
-
Hello, I am also seeing the same error. I did not run into this issue the first few times, and then it started showing up; now every single run gives me this error.
Session history function
My prompt:
Defining the chain
Invoking the chain
I'm using the Llama 3.1 70B-Instruct model. It's strange that I wasn't getting this error for the initial few runs, but now I get it even if I change the session ID every single run.
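One common source of this warning with RunnableWithMessageHistory is a session-history function that builds a fresh history object on every call; the factory is expected to hand back the same object for the same session ID. A dict-based sketch of that contract (plain Python; the stored list is a hypothetical stand-in for a real chat message history object):

```python
# In-memory session store (a sketch, not a specific LangChain class).
_store = {}

def get_session_history(session_id):
    """Return the SAME history object for a given session_id on every call.

    Returning a brand-new object each time can break the parent/child run
    linkage and surface the "parent run not found" warning.
    """
    if session_id not in _store:
        _store[session_id] = []  # stand-in for a chat message history
    return _store[session_id]
```

Note that changing the session ID per run doesn't help if the factory itself is non-deterministic: the point is that repeated lookups within one run must hit the same object.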
-
Checked other resources
Commit to Help
Example Code
Description
Got warning:
Parent run ba1dc312-a394-4d5c-a5eb-75a1bafabc5c not found for run e438a895-7be4-4224-a4da-fb155abb0203. Treating as a root run.
when using RunnableWithMessageHistory with StreamingStdOutCallbackHandler.
System Info
System Information
Package Information