Chat history/memory #5504
-
Does llama.cpp have some built-in way to handle chat history so that the model can refer back to information from previous messages? Without simply sending the chat history as part of the prompt, I mean. Similar to how ChatGPT seems to be able to. I've tried using the
Replies: 2 comments
-
I did not manage to get this to work. But it seems like the go-to way to implement LLM memory is to use something like what is described in the Langchain documentation, and in this article I found on pinecone.io. TLDR: send the last k messages along with the prompt.
I hope this helps anyone else who was confused about this.
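For anyone looking for a concrete starting point, the "last k messages" approach above can be sketched in a few lines of plain Python. This is not llama.cpp, LangChain, or Pinecone API code; the class and method names here are made up for illustration, and a real setup would pass the resulting prompt string to whatever inference call you use.

```python
# Minimal sketch of sliding-window chat memory: keep only the most
# recent k messages and prepend them to each new prompt.
# All names (ChatMemory, add, build_prompt) are illustrative.

from collections import deque


class ChatMemory:
    """Keeps only the most recent k messages to bound prompt size."""

    def __init__(self, k: int = 8):
        # deque with maxlen evicts the oldest message automatically
        self.messages = deque(maxlen=k)

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

    def build_prompt(self, user_input: str) -> str:
        # Concatenate the retained history plus the new user turn
        # into a single prompt string for the model.
        lines = [f"{m['role']}: {m['content']}" for m in self.messages]
        lines.append(f"user: {user_input}")
        return "\n".join(lines)


memory = ChatMemory(k=4)
memory.add("user", "Hi, my name is Alice.")
memory.add("assistant", "Nice to meet you, Alice!")
prompt = memory.build_prompt("What is my name?")
```

Note the trade-off this makes explicit: the model only "remembers" what fits in the last k messages, which is exactly why the LangChain/Pinecone articles go on to discuss summarization and vector-store retrieval for longer-term memory.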
-
Another alternative is to use Zep, which works with any inference SDK, framework, or language. https://www.unremarkable.ai/llama-chat-history-with-zeps-ai-memory-knowledge-graph/