This project is a simple yet powerful chatbot built with FastAPI and LangChain, integrating OpenAI's GPT-3.5-Turbo model. It supports session-based message memory, a customizable user interface with HTML/JS, dark mode, and persistent chat history using localStorage.
- ✅ Session-based memory using LangChain (see the sketch after this list)
- ✅ FastAPI-powered backend
- ✅ Lightweight HTML + JavaScript frontend
- ✅ Username prompt on first visit
- ✅ Persistent chat history (localStorage)
- ✅ Dark mode toggle
- ✅ Clear chat button
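
To make the memory feature above concrete, here is a minimal sketch of session-scoped memory built on LangChain's `RunnableWithMessageHistory`. It is illustrative only: the real `chat_logic.py` may be structured differently, and the `store` / `get_session_history` names are placeholders.

```python
# Minimal sketch of session-based memory with LangChain (illustrative only;
# the project's chat_logic.py may differ). Assumes langchain-openai is
# installed and OPENAI_API_KEY is set in the environment.
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.messages import HumanMessage
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")

store = {}  # session_id -> chat history, kept in process memory

def get_session_history(session_id: str) -> InMemoryChatMessageHistory:
    # Create a fresh history the first time a session_id is seen
    if session_id not in store:
        store[session_id] = InMemoryChatMessageHistory()
    return store[session_id]

chat = RunnableWithMessageHistory(llm, get_session_history)

# Every call with the same session_id sees the earlier messages.
reply = chat.invoke(
    [HumanMessage(content="Hi, my name is İrem.")],
    config={"configurable": {"session_id": "user-123"}},
)
print(reply.content)
```

In this sketch the store is a plain in-process dict, so server-side history is cleared when the server restarts; the localStorage persistence listed above keeps the transcript rendered in the browser.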
- Clone the repository:
git clone https://github.com/iremhazald/chatbot-memory-app.git
cd chatbot-memory-app
- Create a virtual environment (recommended):
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install dependencies:
pip install -r requirements.txt
- Create a `.env` file and add your OpenAI API key (a loading sketch follows these steps):
OPENAI_API_KEY=your_openai_key_here
- Run the server:
uvicorn main:app --reload
- Open in your browser:
http://127.0.0.1:8000
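
For the `.env` step above, a common way to load the key in Python is with `python-dotenv`; this is a sketch under that assumption, and the repository may load its settings differently.

```python
# Sketch of loading the API key from .env, assuming python-dotenv is available.
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the environment
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is missing; add it to your .env file")
```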
- When the page loads, you’ll be prompted to enter your name.
- Type your message and click Send.
- Messages will appear in the chat box along with AI responses.
- You can toggle dark mode or clear chat history at any time.
├── main.py             # FastAPI app and routes
├── chat_logic.py       # LangChain logic for session memory
├── templates/
│   └── index.html      # Frontend user interface
├── requirements.txt    # Python dependencies
└── .env                # Your OpenAI API key (not tracked)
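
Given the layout above, a minimal sketch of how `main.py` could connect the template to the LangChain logic looks like this. The `/chat` route, the request schema, and the `get_response` helper are illustrative assumptions, not the project's exact API.

```python
# Hypothetical wiring for main.py; actual routes and schemas may differ.
from fastapi import FastAPI, Request
from fastapi.responses import HTMLResponse
from fastapi.templating import Jinja2Templates
from pydantic import BaseModel

from chat_logic import get_response  # assumed helper in chat_logic.py

app = FastAPI()
templates = Jinja2Templates(directory="templates")

class ChatRequest(BaseModel):
    session_id: str
    message: str

@app.get("/", response_class=HTMLResponse)
async def index(request: Request):
    # Render the chat UI from templates/index.html
    return templates.TemplateResponse("index.html", {"request": request})

@app.post("/chat")
async def chat(body: ChatRequest):
    # Pass the user's message to the session-aware LangChain logic
    reply = get_response(body.session_id, body.message)
    return {"reply": reply}
```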
- LangChain
- FastAPI
- OpenAI GPT-3.5 Turbo
- HTML + Vanilla JavaScript
- Python 3.9+
İrem Hazal Deveci
LinkedIn • GitHub
This project is licensed under the MIT License. Feel free to use, modify, and share.