ConversationChain with customised prompt #18501
Replies: 2 comments 1 reply
-
If you're not tied to ConversationChain specifically, you can add memory to a chat model following the documentation here. The example showcased there includes two input variables. If you wanted to use ConversationBufferMemory or a similar memory object, you could tweak the `get_session_history` function:

```python
from langchain.memory import ConversationBufferMemory
from langchain_core.chat_history import BaseChatMessageHistory

store = {}

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = ConversationBufferMemory()
    return store[session_id].chat_memory
```

Since you mentioned that one of the inputs is named "context", if you are interested in implementations of question answering or RAG with memory, you can follow the documentation here. Note that in this example the user manages the chat message history directly:
```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

store = {}

def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

with_message_history = RunnableWithMessageHistory(
    rag_chain,
    get_session_history,
    input_messages_key="question",
    history_messages_key="chat_history",
)

question = "What is Task Decomposition?"
print(question)

# The wrapper loads the stored history and injects it under
# "chat_history" itself, so only the new question is passed in.
response = with_message_history.invoke(
    {"question": question},
    config={"configurable": {"session_id": "abc123"}},
)
print(response.content + "\n\n")

second_question = "What are common ways of doing it?"
print(second_question)

response = with_message_history.invoke(
    {"question": second_question},
    config={"configurable": {"session_id": "abc123"}},
)
print(response.content)
```
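The per-session store pattern above can be seen in isolation without any LangChain dependencies: a dict maps each `session_id` to its own lazily created history, so separate sessions never share messages. This is a hypothetical, framework-free sketch for illustration (real LangChain histories store message objects, not plain strings):

```python
# Framework-free sketch of the per-session history pattern.
store: dict[str, list[str]] = {}

def get_session_history(session_id: str) -> list[str]:
    # Create the history lazily on first use, one list per session_id,
    # mirroring the get_session_history function in the LangChain example.
    if session_id not in store:
        store[session_id] = []
    return store[session_id]

# Two turns in session "abc123" accumulate in the same list...
get_session_history("abc123").append("What is Task Decomposition?")
get_session_history("abc123").append("What are common ways of doing it?")

# ...while a different session starts empty.
print(len(get_session_history("abc123")))  # 2
print(len(get_session_history("xyz789")))  # 0
```

Because `get_session_history` returns the same list object on every call with the same id, whatever the chain appends during one turn is visible on the next.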
-
Actually, what I need is a RAG system with the following workflow.
-
I need to create a ConversationChain with a prompt template that contains two input variables, `input` and `context`, together with ConversationBufferMemory as the memory. How can I do that?
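For reference, the combination described here (a two-variable prompt plus buffer-style memory) can be sketched without LangChain to show the moving parts: the prompt is formatted with `input` and `context` each turn, and the accumulated transcript is injected as a third `history` value. Every name below (`TEMPLATE`, `BufferMemory`, `run_turn`) is a hypothetical illustration, not a LangChain API:

```python
# Hypothetical, framework-free sketch of a two-variable prompt with
# buffer memory. It only illustrates how `input`, `context`, and the
# running history fit together on each turn.

TEMPLATE = (
    "Use the context to answer.\n"
    "Context: {context}\n"
    "History:\n{history}\n"
    "Human: {input}\nAI:"
)

class BufferMemory:
    """Keeps every turn as a 'Human: .../AI: ...' transcript."""
    def __init__(self) -> None:
        self.turns: list[str] = []

    def load(self) -> str:
        return "\n".join(self.turns)

    def save(self, user_input: str, ai_output: str) -> None:
        self.turns.append(f"Human: {user_input}\nAI: {ai_output}")

def run_turn(memory: BufferMemory, user_input: str, context: str) -> str:
    # Format the two user-supplied variables plus the stored history.
    prompt = TEMPLATE.format(
        context=context,
        history=memory.load(),
        input=user_input,
    )
    # Stand-in for a real LLM call: return a canned reply.
    ai_output = f"(answer about: {user_input})"
    memory.save(user_input, ai_output)
    return prompt

memory = BufferMemory()
first = run_turn(memory, "What is Task Decomposition?", "some retrieved docs")
second = run_turn(memory, "What are common ways of doing it?", "some retrieved docs")

# The second prompt now contains the first exchange via the memory.
print("What is Task Decomposition?" in second)  # True
```

The design point is the same one the replies above make: something has to merge the extra variables with the memory's output before the prompt is formatted, which is why wrappers like `RunnableWithMessageHistory` (which owns the history key) are often easier than forcing two inputs through ConversationChain.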