How to properly use ConversationalRetrievalChain.from_llm
#20813
uzumakinaruto19 started this conversation in General
Replies: 0 comments
Hi, I have a RAG application/chatbot that uses the ConversationalRetrievalChain from LangChain. For questions like "Hi", retrieval still runs and returns random, irrelevant documents. How do I make the LLM answer questions like this directly, without retrieval?

One more thing: how do I implement memory (long-term would be better) with the ConversationalRetrievalChain.from_llm chain? Whatever I have tried is not working; I tried RunnableWithMessageHistory, but that breaks the retrieval. Does anyone have a workaround for this? Any help will be appreciated, thanks!
```python
chain = ConversationalRetrievalChain.from_llm(
    llm=llm,
    retriever=retriever,
)
```
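One way to approach both asks is to route the query before it reaches the retrieval chain: answer small talk with the LLM alone, and keep the chat history yourself so every retrieval-backed call sees prior turns. Below is a framework-agnostic sketch of that pattern, assuming `llm` is any callable mapping a prompt string to an answer string and `retriever` is any callable returning a list of document strings; `RoutedRagChain` and `is_small_talk` are illustrative names, not LangChain APIs.

```python
# Hypothetical sketch: route greetings past retrieval and carry chat history.
# `llm` and `retriever` are stand-ins for your real components.

SMALL_TALK = {"hi", "hello", "hey", "thanks", "thank you", "bye"}

def is_small_talk(question: str) -> bool:
    """Cheap keyword heuristic; a real app might use an LLM-based classifier."""
    return question.strip().lower().rstrip("!.?") in SMALL_TALK

class RoutedRagChain:
    def __init__(self, llm, retriever):
        self.llm = llm
        self.retriever = retriever
        self.history = []  # (question, answer) pairs: the conversational memory

    def invoke(self, question: str) -> str:
        if is_small_talk(question):
            # Skip retrieval entirely for greetings/small talk.
            answer = self.llm(question)
        else:
            docs = self.retriever(question)
            past = "\n".join(f"Q: {q}\nA: {a}" for q, a in self.history)
            prompt = (
                f"History:\n{past}\n\n"
                "Context:\n" + "\n".join(docs) + "\n\n"
                f"Question: {question}"
            )
            answer = self.llm(prompt)
        self.history.append((question, answer))  # persist this turn
        return answer
```

Staying inside LangChain, the equivalent moves are (to the best of my knowledge of that API) passing `memory=ConversationBufferMemory(memory_key="chat_history", return_messages=True)` to `ConversationalRetrievalChain.from_llm` for short-term memory, and using an LCEL `RunnableBranch` for the small-talk routing; for long-term memory you would back the history store with a database rather than an in-process buffer.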