-
Hello, @bojan2501! I'm here to help you with your issue. I'm a bot that can assist you with bug fixes, answer your questions, and guide you on how to become a contributor. Let's work together on resolving your concern. To format messages from `StreamlitChatMessageHistory` for a custom prompt template, you can use a helper function like the one below.
```python
from langchain_community.chat_message_histories import StreamlitChatMessageHistory
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.messages import HumanMessage, AIMessage

def format_chat_history(chat_history):
    """Render the stored messages in the Llama 3 chat format."""
    formatted_history = ""
    for message in chat_history.messages:
        if isinstance(message, HumanMessage):
            role = "user"
        elif isinstance(message, AIMessage):
            role = "assistant"
        else:
            role = "unknown"
        formatted_history += f"<|start_header_id|>{role}<|end_header_id|>\n{message.content}\n<|eot_id|>\n"
    return formatted_history
```
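To make the helper's behaviour concrete, here is a self-contained sketch that runs without LangChain installed. `FakeHistory` and the two message classes below are minimal stand-ins for `StreamlitChatMessageHistory`, `HumanMessage`, and `AIMessage`, not the real LangChain types:

```python
# Minimal stand-ins for the LangChain classes (illustration only).
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content):
        self.content = content

class FakeHistory:
    """Stand-in for StreamlitChatMessageHistory: just exposes .messages."""
    def __init__(self, messages):
        self.messages = messages

def format_chat_history(chat_history):
    """Render stored messages as Llama 3 header blocks."""
    formatted_history = ""
    for message in chat_history.messages:
        if isinstance(message, HumanMessage):
            role = "user"
        elif isinstance(message, AIMessage):
            role = "assistant"
        else:
            role = "unknown"
        formatted_history += f"<|start_header_id|>{role}<|end_header_id|>\n{message.content}\n<|eot_id|>\n"
    return formatted_history

history = FakeHistory([AIMessage("How can I help you?"), HumanMessage("What can you do?")])
print(format_chat_history(history))
# → <|start_header_id|>assistant<|end_header_id|>
#   How can I help you?
#   <|eot_id|>
#   <|start_header_id|>user<|end_header_id|>
#   What can you do?
#   <|eot_id|>
```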
```python
# Define your prompt template
prompt_data = """
<|begin_of_text|>
<|start_header_id|>system<|end_header_id|>
{system}
<|eot_id|>
{chat_history}
<|start_header_id|>user<|end_header_id|>
{input}
<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
"""
prompt_template = PromptTemplate(template=prompt_data)

# Initialize the message history
history = StreamlitChatMessageHistory(key="chat_messages")

# Define your LLM (SagemakerEndpoint in this case)
llm = SagemakerEndpoint(...)

# Create the chain
chain = prompt_template | llm

# Wrap the chain with RunnableWithMessageHistory
wrapped_chain = RunnableWithMessageHistory(
    chain,
    lambda session_id: history,
    history_messages_key="chat_history",
)

# Format the chat history
formatted_history = format_chat_history(history)
```
```python
# Invoke the chain; RunnableWithMessageHistory requires a session_id in the config
response = wrapped_chain.invoke(
    {"input": "What can you do?", "system": "You are a good assistant.", "chat_history": formatted_history},
    config={"configurable": {"session_id": "default"}},
)
print(response)
```

This example demonstrates how to format the chat history and integrate it into your chain using `RunnableWithMessageHistory`. For more detailed examples and guides, you can refer to the LangChain documentation on message history and the LangChain tutorial on building a chatbot [1][2][3].
-
@dosu
-
Description
I am trying to use a custom prompt format with RunnableWithMessageHistory.
For the history I am using StreamlitChatMessageHistory, and I can't find any information on how to format the messages from the history for a custom prompt.
The model in SageMaker is Llama 3.1 Instruct, so a custom prompt format is needed.
This is the prompt just before it is sent to the LLM (SageMaker):

```
<|begin_of_text|>
<|start_header_id|>system<|end_header_id|>
You are a good assistant.
<|eot_id|>
[AIMessage(content='How can I help you?')]
<|start_header_id|>user<|end_header_id|>
What can you do?
<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
```
The `[AIMessage(content='How can I help you?')]` part needs to be formatted in the specific way the prompt template expects.
Are there any resources with examples of how to format the output from the history in RunnableWithMessageHistory?
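For reference, the target rendering can be produced with a plain string-building step. This is a standalone sketch (the `AIMessage` class here is a minimal stand-in for `langchain_core.messages.AIMessage`) showing how the raw message list would be turned into the Llama 3.1 block instead of its `repr()`:

```python
# Minimal stand-in for langchain_core.messages.AIMessage (illustration only).
class AIMessage:
    def __init__(self, content):
        self.content = content

def render_history(messages):
    """Render a list of messages as Llama 3.1 header blocks instead of repr() output."""
    blocks = []
    for m in messages:
        role = "assistant" if isinstance(m, AIMessage) else "user"
        blocks.append(f"<|start_header_id|>{role}<|end_header_id|>\n{m.content}\n<|eot_id|>")
    return "\n".join(blocks)

history = [AIMessage("How can I help you?")]
print(render_history(history))
# → <|start_header_id|>assistant<|end_header_id|>
#   How can I help you?
#   <|eot_id|>
```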
All best
Bojan
System Info
langchain==0.2.14
langchain-aws==0.1.17
langchain-community==0.2.12
langchain-core==0.2.34
streamlit==1.37.1