Langchain chatbot with chroma vector storage memory #3954
ossirytk started this conversation in Show and tell
I've added support for JSON lorebooks and metadata filtering. It's a bit hacky at the moment, but I'll look at improving the filtering once Chroma supports more complex operations.
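For reference, a minimal sketch of what metadata filtering against a Chroma collection looks like through LangChain. The collection name, metadata key, persist directory, and model path are placeholders (and the classic pre-split langchain imports are assumed), not the project's actual values:

```python
# Illustrative sketch of Chroma metadata filtering via LangChain.
# Collection name, metadata keys, and paths below are assumptions.
from langchain.embeddings import LlamaCppEmbeddings
from langchain.vectorstores import Chroma

embeddings = LlamaCppEmbeddings(model_path="./models/ggml-model.bin")  # hypothetical model path

store = Chroma(
    collection_name="lorebook",                # assumed collection name
    embedding_function=embeddings,
    persist_directory="./character_storage",   # assumed persist directory
)

# LangChain passes the `filter` kwarg through to Chroma's "where" clause,
# so results can be restricted by metadata attached to each lorebook entry.
docs = store.similarity_search(
    "Who rules the northern kingdom?",
    k=4,
    filter={"category": "world_lore"},         # assumed metadata key/value
)

for doc in docs:
    print(doc.metadata, doc.page_content[:80])
```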
This is an upgrade to my previous chatbot. It adds vector storage memory using ChromaDB. The main chatbot is built with llama-cpp-python, LangChain, and Chainlit, and it supports JSON, YAML, V2, and Tavern character card formats.
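For context, the general pattern of wiring a llama-cpp model to a Chroma-backed vector store memory in LangChain looks roughly like this. The model paths, collection name, and chain setup are illustrative assumptions, not the project's exact code:

```python
# Rough sketch: llama-cpp-python model + Chroma-backed vector store memory.
# All paths and names are placeholders.
from langchain.chains import ConversationChain
from langchain.embeddings import LlamaCppEmbeddings
from langchain.llms import LlamaCpp
from langchain.memory import VectorStoreRetrieverMemory
from langchain.vectorstores import Chroma

llm = LlamaCpp(model_path="./models/llama-model.gguf", n_ctx=4096)        # hypothetical model
embeddings = LlamaCppEmbeddings(model_path="./models/llama-model.gguf")   # hypothetical model

# Past conversation turns are embedded into Chroma and retrieved as memory.
store = Chroma(
    collection_name="chat_memory",             # assumed collection name
    embedding_function=embeddings,
    persist_directory="./character_storage",   # assumed persist directory
)
memory = VectorStoreRetrieverMemory(retriever=store.as_retriever(search_kwargs={"k": 4}))

chain = ConversationChain(llm=llm, memory=memory)
print(chain.predict(input="Hello there, do you remember me?"))
```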
This version uses LangChain's LlamaCpp embeddings to parse documents into Chroma vector storage collections. There is also a test script to query and test the collections. Everything is local and in Python, and it runs in a virtual environment with minimal dependencies.
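A rough sketch of that document-parsing step, plus a quick test query in the spirit of the test script, is below. The file name, chunk sizes, and collection name are placeholders rather than the project's actual configuration:

```python
# Sketch: embed documents with LlamaCpp embeddings into a persisted Chroma
# collection, then run a test query. Paths and parameters are assumptions.
from langchain.document_loaders import TextLoader
from langchain.embeddings import LlamaCppEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

embeddings = LlamaCppEmbeddings(model_path="./models/llama-model.gguf")  # hypothetical model

# Load and chunk a source document (placeholder file name and chunk sizes).
docs = TextLoader("./documents/character_background.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=512, chunk_overlap=64).split_documents(docs)

# Embed the chunks and persist them as a Chroma collection on disk.
store = Chroma.from_documents(
    chunks,
    embeddings,
    collection_name="memory",                  # assumed collection name
    persist_directory="./character_storage",   # assumed persist directory
)
store.persist()

# Quick test query against the collection.
for doc in store.similarity_search("What happened in the last adventure?", k=3):
    print(doc.page_content[:100])
```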
The purpose of this project is to provide a more fleshed-out, character-based chatbot as a foundation for testing and prototyping Chroma features.
https://github.com/ossirytk/llama-cpp-chat-memory