Replies: 5 comments 4 replies
-
Hey @sfitts, great to see you diving into another challenge! I remember our last interaction; let's see how we can tackle this one together. Based on the code and error message you've provided, it seems like there's a mismatch between the expected key for the input messages. To resolve this issue, you can specify the correct key when instantiating `RunnableWithMessageHistory`:

```python
with_message_history = RunnableWithMessageHistory(
    ChatOpenAI(),
    get_session_history,
    input_messages_key="prompts",
)
```

In this adjusted code, `input_messages_key="prompts"` tells the wrapper which key in the input dict holds the messages. However, please note that this solution assumes that "prompts" is the correct key for your input messages. If the key is different or if there are other issues in your code, you might need to adjust the solution accordingly. I hope this helps! If you have any other questions or if something is not clear, please let me know.
-
Hi @sfitts, You should define the input key and then pass a dict in the `invoke` call:

```python
from dotenv import load_dotenv
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

load_dotenv()

store = {}


def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]


with_message_history = RunnableWithMessageHistory(
    ChatOpenAI(),
    get_session_history,
    input_messages_key="question",
)

history = get_session_history("abc123")
print(history.messages)

result = with_message_history.invoke(
    {"question": "What did Simone de Beauvoir believe about free will"},
    config={"configurable": {"session_id": "abc123"}},
)
print(result)
print(history.messages)
```
-
Hello @maximeperrindev, I tried your solution, but it's throwing this exception: []
-
@ArionIII -- after not getting much in the way of traction here we ended up creating a wrapper class to fix the issue (turns out you have to adjust behavior for both inputs and outputs). Here is that class -- https://gist.github.com/sfitts/3f71efa772348f407a68dcae445de1fb. Hope it helps.
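The linked gist isn't reproduced here, but the general shape of such a wrapper (a hypothetical sketch under assumed names, not the actual gist code) is to wrap a bare message into the dict the history runnable expects on the way in, and unwrap the dict result on the way out:

```python
# Hypothetical sketch of the wrapper idea (not the actual gist code):
# adapt a dict-in/dict-out runnable so callers can pass a bare message
# and get a bare message back.

class MessageIOWrapper:
    """Wraps a dict-based runnable so it accepts and returns bare messages."""

    def __init__(self, inner, input_key="input", output_key="output"):
        self.inner = inner            # the dict-based runnable (e.g. with history)
        self.input_key = input_key    # key the inner runnable expects on input
        self.output_key = output_key  # key the inner runnable uses on output

    def invoke(self, message, **kwargs):
        # Wrap the bare message into the dict the inner runnable expects.
        result = self.inner.invoke({self.input_key: message}, **kwargs)
        # Unwrap the dict result back into a bare message.
        return result[self.output_key]


# Toy inner runnable standing in for the real history-wrapped chain:
class EchoRunnable:
    def invoke(self, inputs, **kwargs):
        return {"output": "echo: " + inputs["input"]}


wrapped = MessageIOWrapper(EchoRunnable())
print(wrapped.invoke("hello"))  # prints: echo: hello
```

The real gist presumably also adjusts how outputs are recorded into the history, as the comment above notes; this sketch only shows the input/output adaptation.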
-
I faced the same error.
-
Checked other resources
Commit to Help
Example Code
Description
Based on the documentation found here https://python.langchain.com/docs/expression_language/how_to/message_history#messages-input-messages-output, I believe that the example code should correctly submit the prompt and add the prompt and subsequent result to the history. However, when the code is run I get:

Looking at the code in `_exit_history` (`history.py`, line 422) I can see that it is expecting the `inputs` from the run to be a `dict` with the key "input". However, at that point the key is "prompts" (presumably set by the OpenAI code; I haven't dug into that yet). The only way to change the key it is looking for would be to set `input_messages_key`, but doing that would require that the initial input is a `dict` as well, which I don't want.

This would seem to be either a documentation issue (though I don't see a way to make it work at all), a misunderstanding on my part (though I did get the other options given to work), or a bug in either `RunnableWithMessageHistory` or `ChatOpenAI`. Just not sure which ;).

TL;DR: is there a way to make message history work when the input and output are a `BaseMessage`?

System Info
System Information
Package Information
Packages not installed (Not Necessarily a Problem)
The following packages were not found: