This project demonstrates a conversational AI chatbot built using LangChain and OpenAI's API, enhanced with two types of memory:
🧠 ConversationBufferMemory: preserves the raw message history.
🧠 ConversationSummaryMemory: maintains a dynamic summary of the conversation.
These memories are combined using CombinedMemory to give the LLM a richer context: the verbatim recent exchanges plus a concise running summary. The result is a chatbot that responds with improved coherence, continuity, and awareness of long-term context.
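A minimal sketch of how the two memories can be wired together, assuming the classic `langchain.memory` API; the memory keys and model settings below are illustrative choices, not taken verbatim from the notebook:

```python
from langchain.chat_models import ChatOpenAI
from langchain.memory import (
    CombinedMemory,
    ConversationBufferMemory,
    ConversationSummaryMemory,
)

# Verbatim transcript of the most recent turns.
buffer_memory = ConversationBufferMemory(
    memory_key="chat_history_lines", input_key="input"
)

# Rolling summary, updated by an LLM call after each turn.
summary_memory = ConversationSummaryMemory(
    llm=ChatOpenAI(temperature=0), memory_key="history", input_key="input"
)

# Expose both memories to the prompt, each under its own key.
combined_memory = CombinedMemory(memories=[buffer_memory, summary_memory])
```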
💬 OpenAI GPT-based chatbot (ChatOpenAI)
🔁 Multi-memory support (CombinedMemory)
📝 Prompt engineering with dynamic input injection (see the prompt sketch after this list)
🧪 Interactive examples in Jupyter Notebook (.ipynb)
✅ Modular, beginner-friendly, and production-ready
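In practice, "dynamic input injection" means the prompt template declares variables that match the memory keys, so LangChain fills them in on every call. A sketch, assuming the keys chosen above; the wording of the template is illustrative:

```python
from langchain.prompts import PromptTemplate

TEMPLATE = """The following is a friendly conversation between a human and an AI.

Summary of the conversation so far:
{history}

Most recent turns (verbatim):
{chat_history_lines}

Human: {input}
AI:"""

# The input variables must match the memory keys plus the user input key.
PROMPT = PromptTemplate(
    input_variables=["history", "chat_history_lines", "input"],
    template=TEMPLATE,
)
```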
ConversationBufferMemory stores the most recent turns verbatim, so the LLM can follow up on exact wording.
ConversationSummaryMemory uses a language model to progressively condense older turns into a running summary, keeping long conversations within the context window.
CombinedMemory injects both into the prompt template, so every response is grounded in the exact recent exchange as well as the long-term gist of the conversation.
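Putting the pieces together, an end-to-end run could look like the sketch below, reusing `combined_memory` and `PROMPT` from the earlier snippets; the model name and example inputs are placeholders:

```python
from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

# ConversationChain fills both memory keys plus the new user input
# into PROMPT before every model call.
conversation = ConversationChain(
    llm=llm,
    prompt=PROMPT,
    memory=combined_memory,
    verbose=True,
)

conversation.predict(input="Hi! I'm planning a spring trip to Kyoto.")
conversation.predict(input="Remind me what I said I was planning?")
```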