Welcome to the Local LLM-based Retrieval-Augmented Generation (RAG) System! This repository provides the full code to build a private, offline RAG system for managing and querying personal documents locally using a combination of OpenSearch, Sentence Transformers, and Large Language Models (LLMs). Perfect for anyone seeking a privacy-friendly solution to manage documents without relying on cloud services.
- Privacy-Friendly Document Search: Search your personal documents entirely on your own machine, without uploading them to the cloud.
- Hybrid Search with OpenSearch: Combines traditional keyword (BM25) matching with semantic vector search.
- Easy Integration with LLMs: Uses a local LLM to turn retrieved passages into personalized, context-aware answers (see the sketch after this list).
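As a minimal sketch of how these pieces fit together (not the repo's actual code), the snippet below embeds a question, runs a hybrid OpenSearch query, and passes the retrieved passages to a local LLM. It assumes an index named `documents` with a `text` field and an `embedding` knn_vector field, the `all-MiniLM-L6-v2` embedding model, and an Ollama server on localhost; the index names, fields, and LLM backend used in this repo may differ.

```python
import requests
from opensearchpy import OpenSearch
from sentence_transformers import SentenceTransformer

# Connect to a local OpenSearch node (add auth/SSL options if your install is secured)
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])
embedder = SentenceTransformer("all-MiniLM-L6-v2")

question = "What does my lease say about subletting?"
query_vector = embedder.encode(question).tolist()

# Hybrid retrieval: combine a keyword (BM25) match with a k-NN vector match
response = client.search(
    index="documents",
    body={
        "size": 5,
        "query": {
            "bool": {
                "should": [
                    {"match": {"text": question}},  # lexical match
                    {"knn": {"embedding": {"vector": query_vector, "k": 5}}},  # semantic match
                ]
            }
        },
    },
)
context = "\n".join(hit["_source"]["text"] for hit in response["hits"]["hits"])

# Generate an answer with a local LLM (here: Ollama's HTTP API; your backend may differ)
answer = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        "stream": False,
    },
).json()["response"]
print(answer)
```

Note that the `bool`/`should` clause above simply adds BM25 and vector scores, which live on different scales; production hybrid search usually normalizes them (for example with an OpenSearch search pipeline) before combining.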
To get started:

- Clone the repo:
  ```bash
  git clone https://github.com/JAMwithAI/build_your_local_RAG_system.git
  ```
- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```
- Configure `constants.py` with your embedding model and OpenSearch settings (an illustrative example follows these steps).
- Run the Streamlit app:
  ```bash
  streamlit run welcome.py
  ```
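The exact contents of `constants.py` depend on the repo's implementation, but a configuration along these lines is typical (the variable names below are illustrative, not necessarily the ones used in the code):

```python
# constants.py -- illustrative values only; variable names are hypothetical

# Embedding model used to vectorize documents and queries
EMBEDDING_MODEL_NAME = "sentence-transformers/all-MiniLM-L6-v2"
EMBEDDING_DIMENSION = 384  # must match the chosen model's output size

# OpenSearch connection and index settings
OPENSEARCH_HOST = "localhost"
OPENSEARCH_PORT = 9200
OPENSEARCH_INDEX = "documents"
```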
For a detailed walkthrough of the setup and code, check out our blog posts:

- Build a Local LLM-based RAG System for Your Personal Documents - Part 1
- Build a Local LLM-based RAG System for Your Personal Documents - Part 2: The Guide
Enjoy building your private, AI-driven document management system! If you find this project useful, consider sharing it with others in the community.