Replies: 2 comments 1 reply
-
🤖 Hey @fabiancpl! Good to see you diving into more LangChain intricacies. Your enthusiasm is infectious! 😄 To extend the AIMessage and HumanMessage classes with additional attributes such as a timestamp, message id, and reply id, you can subclass them:
```python
from datetime import datetime
from typing import Optional

from pydantic import Field  # recent langchain_core message classes are pydantic v2 models

from langchain_core.messages import AIMessage, HumanMessage


class ExtendedAIMessage(AIMessage):
    """AIMessage with extra metadata, declared as pydantic fields so the
    model accepts and serializes them."""
    timestamp: datetime = Field(default_factory=datetime.now)
    message_id: Optional[str] = None
    reply_id: Optional[str] = None


class ExtendedHumanMessage(HumanMessage):
    """HumanMessage with the same extra metadata fields."""
    timestamp: datetime = Field(default_factory=datetime.now)
    message_id: Optional[str] = None
    reply_id: Optional[str] = None
```
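For illustration, a quick usage sketch of the subclasses above (the id values are made up):

```python
# Quick usage sketch; the id values below are made-up examples.
msg = ExtendedHumanMessage(
    content="What's the weather like?",
    message_id="m-123",
    reply_id="r-456",
)
print(msg.timestamp, msg.message_id, msg.content)

# The extra attributes are pydantic fields, so they are included when the
# message is serialized (e.g. msg.model_dump() under pydantic v2).
```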
To persist these messages, including the extra fields, you can extend BaseChatMessageHistory and serialize each message before writing it to Redis:

```python
import json

from langchain_core.chat_history import BaseChatMessageHistory


class ExtendedBaseChatMessageHistory(BaseChatMessageHistory):
    def add_message(self, message):
        # Dump the message (including the custom fields) to a dict, then to
        # JSON; default=str handles the datetime timestamp.
        message_dict = message.__dict__
        message_json = json.dumps(message_dict, default=str)
        # Save message_json in Redis
```

Please note that the above code is a simplified example and might need to be adjusted based on your specific use case and the structure of your Redis database. If you need more specific help, please share the relevant source code.
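For a more end-to-end picture, here is a rough sketch of what the Redis side could look like. This is not from the original answer: it assumes the redis-py client, pydantic v2 (`model_dump_json` / `model_validate`), and a made-up `chat:<session_id>` key scheme, so adjust it to your own setup:

```python
import json

import redis  # assumes the redis-py package is installed

from langchain_core.chat_history import BaseChatMessageHistory


class RedisExtendedChatMessageHistory(BaseChatMessageHistory):
    """Sketch of a Redis-backed history for the extended message classes."""

    def __init__(self, session_id: str, url: str = "redis://localhost:6379/0"):
        self.key = f"chat:{session_id}"          # hypothetical key scheme
        self.client = redis.Redis.from_url(url)

    def add_message(self, message) -> None:
        # model_dump_json (pydantic v2) serializes the extra fields and
        # renders the datetime timestamp as an ISO string.
        self.client.rpush(self.key, message.model_dump_json())

    @property
    def messages(self):
        # Rebuild the extended objects; the stored "type" field tells us
        # whether a blob was a human or an AI message.
        out = []
        for raw in self.client.lrange(self.key, 0, -1):
            data = json.loads(raw)
            cls = ExtendedAIMessage if data.get("type") == "ai" else ExtendedHumanMessage
            out.append(cls.model_validate(data))
        return out

    def clear(self) -> None:
        self.client.delete(self.key)
```

Storing one JSON blob per message in a Redis list keeps the ordering and makes it straightforward to rebuild the typed message objects later.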
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
Here's how I added a `created_at` timestamp to the AIMessage object that gets saved in our state and persisted by the LangGraph checkpointer (useful when you need to display message timestamps in a user interface).

```python
from datetime import datetime

from langchain_core.messages import SystemMessage
from langchain_openai import ChatOpenAI

# Node (LlamaPressState and tools come from elsewhere in our codebase)
def llamapress(state: LlamaPressState):
    additional_instructions = state.get("agent_prompt")

    # System message
    sys_msg = SystemMessage(content=f"""You are LlamaPress, a helpful AI assistant.
In normal chat conversations, feel free to implement markdown formatting to make your responses more readable, if it's appropriate.
Here are additional instructions provided by the user: <ADDITIONAL_STATE_AND_CONTEXT> {state} </ADDITIONAL_STATE_AND_CONTEXT>
<USER_INSTRUCTIONS> {additional_instructions} </USER_INSTRUCTIONS>""")

    llm = ChatOpenAI(model="o4-mini")
    llm_with_tools = llm.bind_tools(tools)
    llm_response_message = llm_with_tools.invoke([sys_msg] + state["messages"])
    llm_response_message.response_metadata['created_at'] = str(datetime.now())
    return {"messages": [llm_response_message]}
```

And here's how I put a `created_at` timestamp into `response_metadata` on my HumanMessage when creating the initial state object, before invoking the node workflow:

```python
app = self.get_app_from_workflow_string(langgraph_workflow)  # parses langgraph.json to get build_workflow
# Create messages from the message content
messages = [HumanMessage(content=message.get("message"), response_metadata={'created_at': datetime.now()})]
# Start with the transformed messages field
state = {"messages": messages}
return app, state
```
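To round this out with how the timestamps can be read back for display, here is a small illustrative sketch (not from the original reply; the variable names are assumptions):

```python
# Illustrative only: render each checkpointed message with its created_at.
# `state` is whatever you load back from the LangGraph checkpointer.
for msg in state["messages"]:
    created_at = msg.response_metadata.get("created_at", "unknown")
    print(f"[{created_at}] {msg.type}: {msg.content}")
```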
-
Hi folks.
I am interested in extending the AIMessage and HumanMessage classes to include additional info like a timestamp, message id, reply id, etc. All of this information must also be saved into my Redis memory.
How can I extend these classes to include additional attributes alongside content, and then use these custom classes in a chain?
Thanks.