A proof-of-concept application for natural language querying using language models and graph-based workflows. This project demonstrates how to process natural language queries, interact with knowledge bases, and provide results through a Streamlit web interface.
- Natural language query processing
- Integration with language models (e.g., OpenAI)
- Modular agent and tool architecture
- Streamlit-based user interface
- Easily extensible for new data sources and tools
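Under the hood, a graph-based workflow of this kind is usually expressed as a small state graph whose nodes are the individual processing steps. The sketch below is illustrative only, not the project's actual graph: the state fields and node names (`generate_sql`, `execute_sql`) are assumptions used to show how a minimal LangGraph `StateGraph` pipeline can be wired up.

```python
from typing import TypedDict

from langgraph.graph import END, StateGraph


class QueryState(TypedDict):
    question: str
    sql: str
    result: str


def generate_sql(state: QueryState) -> dict:
    # Placeholder node: a real implementation would call a language model here.
    return {"sql": f"SELECT ...  -- derived from: {state['question']}"}


def execute_sql(state: QueryState) -> dict:
    # Placeholder node: a real implementation would run the SQL against the knowledge base.
    return {"result": "rows..."}


workflow = StateGraph(QueryState)
workflow.add_node("generate_sql", generate_sql)
workflow.add_node("execute_sql", execute_sql)
workflow.set_entry_point("generate_sql")
workflow.add_edge("generate_sql", "execute_sql")
workflow.add_edge("execute_sql", END)

app = workflow.compile()
print(app.invoke({"question": "How many users signed up last week?", "sql": "", "result": ""}))
```

Each node receives the current state and returns a partial update, which keeps individual steps (SQL generation, execution, summarisation) easy to swap out or extend.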
nlq_lang_graph_poc_docker/
├── lang_graph_poc/ # Main application code
├── streamlit_apps/ # Streamlit UI apps
├── tests/ # Unit and integration tests
├── requirements/ # Dependency files
├── requirements.txt # Main requirements file
├── Dockerfile # Docker build file
├── docker-compose.yml # (Optional) Docker Compose file
├── README.md # Project documentation
└── ...
- Python 3.10+
- pip
- (Optional) Docker and Docker Compose
- Clone the repository:
  git clone <your-repo-url>
  cd nlq_lang_graph_poc_docker
- Create and activate a virtual environment:
  python3 -m venv venv
  source venv/bin/activate
- Install dependencies:
  pip install --upgrade pip
  pip install -r requirements.txt
- Set up environment variables:
  - Create a .env file in the project root and add any required API keys or settings, for example:
    OPENAI_API_KEY=your-key-here
    (how the app can load this key is sketched after these setup steps)
- Run the Streamlit app:
  streamlit run streamlit_apps/streamlit_app_0.py
- Visit http://localhost:8501 in your browser.
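For orientation, the sketch below shows roughly what a Streamlit entry point for this workflow can look like, including how the OPENAI_API_KEY from the .env step is loaded with python-dotenv. It is illustrative only: the real streamlit_apps/streamlit_app_0.py may be structured differently, and run_query is a hypothetical stand-in for the project's workflow call.

```python
import os

import streamlit as st
from dotenv import load_dotenv  # assumes python-dotenv is installed via requirements

load_dotenv()  # reads OPENAI_API_KEY and other settings from the project-root .env file


def run_query(question: str) -> str:
    # Hypothetical stand-in: the real app would invoke the compiled workflow here.
    return f"(answer for: {question})"


st.title("Natural Language Query PoC")

if not os.getenv("OPENAI_API_KEY"):
    st.warning("OPENAI_API_KEY is not set; add it to your .env file.")

question = st.text_input("Ask a question about your data")
if st.button("Run query") and question:
    with st.spinner("Running workflow..."):
        answer = run_query(question)
    st.write(answer)
```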
- Build the Docker image:
  docker build -t nlq-lang-graph-poc .
- Run the container:
  docker run --env-file .env -p 8501:8501 nlq-lang-graph-poc
- The app will be available at http://localhost:8501.
- (Optional) Using Docker Compose:
  - If you have a docker-compose.yml:
    docker-compose up --build
- Run the tests:
  pytest
- Test with steps up to Generate & Validate SQL
- Test with all steps: Generate SQL >> Verify SQL >> Clarification >> Execute >> Process >> Summarisation
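A unit test for an individual step can follow the usual pytest pattern. The sketch below is self-contained and illustrative: generate_sql here is a local stand-in, and in the real test suite you would import the corresponding function from the project's own modules (exact paths depend on the code under lang_graph_poc/).

```python
# test_sql_generation.py -- illustrative sketch; swap the local stand-in for an import
# from the project's own SQL-generation module.
import pytest


def generate_sql(question: str) -> str:
    """Local stand-in for the project's SQL-generation step (hypothetical)."""
    if not question.strip():
        raise ValueError("empty question")
    return f"SELECT count(*) FROM users  -- {question}"


def test_generate_sql_returns_select_statement():
    sql = generate_sql("How many users signed up last week?")
    assert sql.strip().lower().startswith("select")


def test_generate_sql_rejects_empty_question():
    with pytest.raises(ValueError):
        generate_sql("   ")
```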