AI-powered fashion recommendation system leveraging LLMs, embeddings, and retrieval techniques to deliver personalized shopping experiences.
This project is a Retrieval-Augmented Generation (RAG) chatbot designed for fashion e-commerce. It provides personalized recommendations, answers product queries, and enhances user engagement using state-of-the-art LLMs and vector-based retrieval.
Built with FastAPI, FAISS, ChromaDB, LangChain, Ollama, and Streamlit, this system efficiently indexes a 30K-product fashion dataset and serves real-time recommendations.
✅ AI-Powered Fashion Recommendations – Get smart and personalized product suggestions.
✅ Hybrid Retrieval (FAISS + BM25 + ChromaDB) – Combines dense and sparse search for better results.
✅ LLM-Driven Q&A – Handles customer queries with real-time responses.
✅ Cross-Encoder Reranking – Improves retrieval accuracy.
✅ Self-Querying Retriever – Converts queries into structured filters.
✅ Streamlit Chatbot UI – A modern, user-friendly interface.
✅ FastAPI Backend – A scalable API serving the recommender.
✅ Dockerized Deployment – Runs seamlessly in containers.
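To illustrate the self-querying idea from the feature list, here is a toy parser that turns free text into a structured filter. The real retriever uses an LLM to emit these filters; the regex, color list, and field names below are purely illustrative:

```python
import re

COLORS = {"red", "blue", "black", "white", "green"}

def to_filter(query: str) -> dict:
    """Toy self-query parser: pull a price cap and a color out of free text.
    A real self-querying retriever delegates this extraction to an LLM."""
    out = {}
    price = re.search(r"under \$?(\d+)", query.lower())
    if price:
        out["max_price"] = int(price.group(1))
    color = next((w for w in query.lower().split() if w in COLORS), None)
    if color:
        out["color"] = color
    return out

# to_filter("red dresses under $50") -> {"max_price": 50, "color": "red"}
```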
| Category | Tools Used |
|---|---|
| Programming | Python 3.12 |
| LLM Models | GPT-4o-mini, Llama 3.2:3B, Ollama |
| Vector Search | FAISS, ChromaDB |
| Retrieval & Ranking | BM25, LangChain |
| Backend | FastAPI, Pydantic, Loguru |
| Frontend | Streamlit |
| Deployment | Docker, Docker Compose |
| Data Handling | Pandas, NumPy, Kaggle API |
- Python 3.12+
- Docker & Docker Compose
- Ollama installed on your machine
```bash
git clone https://github.com/amine-akrout/llm-based-recommender.git
cd llm-based-recommender
```
Copy the example `.env` file and configure the necessary credentials:

```bash
cp .env.example .env
```

Modify the `.env` file to include your Kaggle API key, OpenAI API key, and other configuration values.
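A typical `.env` might look like the snippet below. The exact variable names are defined in `.env.example`; the keys shown here are assumptions for illustration:

```ini
# Illustrative keys – check .env.example for the exact names
KAGGLE_USERNAME=your-kaggle-username
KAGGLE_KEY=your-kaggle-api-key
OPENAI_API_KEY=sk-...
```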
```bash
docker-compose up --build
```

Or:

```bash
make docker-start
```
```bash
make install-python   # Install Python
make install          # Install dependencies
make indexing         # Index the dataset
make retriever        # Build the retriever
make app              # Start the FastAPI app
make ui               # Start the Streamlit UI
```
The FastAPI backend exposes several endpoints. After starting the API, explore them via the interactive docs:
🔗 Swagger UI: http://localhost:8000/docs
🔗 Redoc: http://localhost:8000/redoc
| Method | Endpoint | Description |
|---|---|---|
| POST | `/recommend/` | Get fashion product recommendations |
| GET | `/health` | Check API health status |
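As a sketch, the `/recommend/` endpoint can be called from Python like this. The payload field `query` is an assumption; check the Swagger schema for the exact request model:

```python
import json
import urllib.request

API_URL = "http://localhost:8000/recommend/"

def build_request(query: str) -> urllib.request.Request:
    """Build the POST request; the payload schema is an assumption."""
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )

def recommend(query: str) -> dict:
    """Send the query to the running API and return the parsed JSON response."""
    with urllib.request.urlopen(build_request(query)) as resp:
        return json.loads(resp.read())

# recommend("red summer dress under $50")  # requires the API to be running
```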
🔗 Access the UI at: http://localhost:8501
The chatbot interface allows users to ask for product recommendations, filter results, and get AI-powered responses.
The recommender is built using a 30K-product e-commerce dataset indexed with FAISS, BM25, and ChromaDB.
- Download Dataset – Fetches the dataset via the Kaggle API
- Preprocess Data – Cleans and structures the dataset
- Generate Embeddings – Vectorizes product descriptions
- Store in FAISS & BM25 – Hybrid retrieval for fast search
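The embed-and-search step above can be sketched in miniature. This uses a toy bag-of-words embedding and plain NumPy cosine similarity in place of a real sentence encoder and FAISS; the product strings are made up for illustration:

```python
import numpy as np

PRODUCTS = [
    "red summer dress with floral print",
    "blue denim jacket slim fit",
    "white cotton t-shirt crew neck",
]

def embed(texts: list[str], vocab: list[str]) -> np.ndarray:
    """Toy bag-of-words embedding; a real pipeline uses a sentence encoder."""
    index = {w: i for i, w in enumerate(vocab)}
    mat = np.zeros((len(texts), len(vocab)))
    for row, text in enumerate(texts):
        for word in text.lower().split():
            if word in index:
                mat[row, index[word]] += 1.0
    norms = np.linalg.norm(mat, axis=1, keepdims=True)
    return mat / np.clip(norms, 1e-9, None)  # unit-normalize for cosine similarity

def search(query: str, k: int = 2) -> list[str]:
    """Rank products by cosine similarity to the query, FAISS-style."""
    vocab = sorted({w for t in PRODUCTS for w in t.lower().split()})
    doc_vecs = embed(PRODUCTS, vocab)
    q_vec = embed([query], vocab)[0]
    scores = doc_vecs @ q_vec
    return [PRODUCTS[i] for i in np.argsort(-scores)[:k]]
```

In the actual system, FAISS replaces the brute-force dot product with an approximate nearest-neighbor index over the 30K product embeddings.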
The chatbot's recommendation process follows a structured LangGraph workflow:
- Check Topic: Determines if the query is relevant to fashion.
- Self-Query Retrieve: Converts the query into structured filters and retrieves matching products.
- Ranker: If self-query retrieval returns no results, falls back to hybrid BM25 & FAISS ranking.
- RAG Recommender: Uses LLM to generate the final recommendation.
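The control flow of the steps above can be sketched with stubbed nodes. The real workflow wires these as LangGraph nodes with LLM calls; every function body here is a placeholder:

```python
def check_topic(query: str) -> bool:
    """Node 1 – crude keyword check standing in for an LLM topic classifier."""
    fashion_terms = {"dress", "shirt", "jacket", "shoes", "jeans", "fashion"}
    return any(word in fashion_terms for word in query.lower().split())

def self_query_retrieve(query: str) -> list[str]:
    """Node 2 – stub; the real node builds structured filters and queries the index."""
    return []  # pretend structured retrieval found nothing

def hybrid_rank(query: str) -> list[str]:
    """Node 3 – stub for the BM25 + FAISS fallback ranker."""
    return ["red summer dress", "floral midi dress"]

def rag_recommend(query: str, docs: list[str]) -> str:
    """Node 4 – stub; the real node prompts the LLM with the retrieved products."""
    return f"Based on your query, consider: {', '.join(docs)}"

def run_workflow(query: str) -> str:
    """Route a query through the four nodes in order."""
    if not check_topic(query):
        return "Sorry, I can only help with fashion-related questions."
    docs = self_query_retrieve(query) or hybrid_rank(query)
    return rag_recommend(query, docs)
```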
```
📦 llm-based-recommender
├── 📂 src
│   ├── 📂 api           # FastAPI Backend
│   ├── 📂 indexing      # FAISS, BM25, Chroma Indexing
│   ├── 📂 retriever     # Query Processing
│   ├── 📂 recommender   # Core LLM-based Recommender
│   ├── 📂 ui            # Streamlit Chatbot
│   ├── config.py        # App Configuration
├── 📄 docker-compose.yml # Docker Services
├── 📄 Dockerfile         # API Containerization
├── 📄 pyproject.toml     # Project Dependencies
├── 📄 requirements.txt   # Python Packages
├── 📄 .env.example       # Environment Variables
```
🔹 Fine-tune LLM for better recommendations
🔹 Improve UI/UX with product images
🔹 Add multi-language support
🔹 Deploy to AWS/GCP
Contributions are welcome! If you’d like to contribute:
1️⃣ Fork the repo
2️⃣ Create a new branch
3️⃣ Commit your changes
4️⃣ Submit a pull request
For major changes, please open an issue first to discuss what you’d like to change.
If you find this project useful, don’t forget to ⭐ star the repository! 🚀✨
👤 Amine
💼 LinkedIn