A modern AI application stack using Gemini, Neo4j, and LangChain.
- 🤖 AI-powered chatbot using Google's Gemini Pro
- 📚 RAG (Retrieval Augmented Generation) with Neo4j vector store (see the sketch below)
- 📄 PDF document processing and Q&A
- 🌐 REST API for integration
- 💻 Modern web interface
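To make the moving parts concrete, here is a minimal sketch of the RAG flow, assuming the `langchain-google-genai` and `langchain-community` packages; the model names, connection details, and index name are illustrative placeholders, not the exact code used by the services in this repo.

```python
# Minimal RAG sketch (illustrative): Gemini via langchain-google-genai,
# similarity search via a Neo4j vector index. Connection details and the
# index name are placeholders -- see the service code for the real wiring.
from langchain.chains import RetrievalQA
from langchain_community.vectorstores import Neo4jVector
from langchain_google_genai import ChatGoogleGenerativeAI, GoogleGenerativeAIEmbeddings

# Reads GOOGLE_API_KEY from the environment
llm = ChatGoogleGenerativeAI(model="gemini-pro")
embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")

# Attach to an existing vector index stored in Neo4j (placeholder credentials)
store = Neo4jVector.from_existing_index(
    embedding=embeddings,
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",
    index_name="vector",
)

# Retrieve similar chunks from Neo4j, then answer with Gemini
qa = RetrievalQA.from_chain_type(llm=llm, retriever=store.as_retriever())
print(qa.invoke({"query": "What is in the loaded documents?"})["result"])
```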
- Docker and Docker Compose
- Google Cloud API key with Gemini API enabled
- Neo4j database (included in Docker setup)
- Clone the repository:

  ```bash
  git clone <your-repo-url>
  cd genai-stack
  ```
- Create a `.env` file with your configuration:
  ```env
  LLM=gemini-pro
  EMBEDDING_MODEL=google-genai-embedding-001
  GOOGLE_API_KEY=your-google-api-key-here
  ```
- Start the stack:

  ```bash
  docker-compose up
  ```
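If the containers complain about missing configuration, a quick way to confirm that the values in `.env` are readable is a small check script; this is a minimal sketch assuming the `python-dotenv` package, not part of the stack itself.

```python
# Illustrative sanity check: confirm the variables defined in .env can be read.
# Assumes the python-dotenv package (pip install python-dotenv); not part of this repo.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current directory
for name in ("LLM", "EMBEDDING_MODEL", "GOOGLE_API_KEY"):
    print(f"{name}: {'set' if os.getenv(name) else 'MISSING'}")
```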
- Frontend: http://localhost:8505
- Bot UI: http://localhost:8501
- PDF Bot: http://localhost:8503
- API: http://localhost:8504 (example request below)
- Neo4j Browser: http://localhost:7474
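Once the stack is up, the REST API can be called from any HTTP client. The route and parameters below are hypothetical placeholders for illustration; check `api.py` for the actual endpoints.

```python
# Illustrative only: the endpoint path and parameters are hypothetical --
# consult api.py for the real routes exposed on port 8504.
import requests

resp = requests.get(
    "http://localhost:8504/query",                      # hypothetical route
    params={"text": "What documents have been loaded?"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```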
The project uses Docker Compose for development with hot-reloading enabled. Each service can be developed independently:
- `front-end/`: React-based web interface
- `bot.py`: Main chatbot interface
- `pdf_bot.py`: PDF processing and Q&A
- `api.py`: REST API endpoints
- `loader.py`: Data loading utilities
Private - All rights reserved