GeoGraph Guardian is a supply chain risk monitoring system that combines graph analytics, real-time geopolitical data, and AI to transform how organizations manage supply chain risk.
GeoGraph Guardian enables:
- Real-time risk prediction by analyzing network patterns and weather events
- Natural language queries for complex supply chain scenarios
- GPU-accelerated graph analytics to identify vulnerable nodes and alternative paths
- Hybrid query execution combining ArangoDB's AQL for path analysis and cuGraph for complex network metrics
- Interactive visualizations showing risk propagation through the supply chain network
- Automated mitigation recommendations based on historical patterns and network structure
## Key Features
- Graph-powered Supply Chain Analytics: Visualize and analyze your entire supply chain as a graph network.
- Natural Language Interface: Ask complex questions about your supply chain in plain English.
- Weather Impact Analysis: Monitor and assess weather-related risks to your supply chain.
- AI-Driven Insights: Receive AI-generated explanations and recommendations for risk mitigation.
- GPU Acceleration: Process large supply chain networks with NVIDIA GPU acceleration.
## Architecture

GeoGraph Guardian is built with the following stack:
- Data Integration: Combines supply chain networks with geopolitical event data
- Graph Processing: Uses NetworkX for data transformation and ArangoDB for persistent storage
- GPU Acceleration: Leverages NVIDIA's cuGraph for complex network algorithms like community detection and centrality analysis
- AI Agent Framework: Built with LangGraph and LangChain for natural language processing and response generation
- Hybrid Query System: Dynamically routes queries between AQL and cuGraph
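The hybrid routing described above can be sketched with a simple heuristic. The function name, keyword list, and return labels below are illustrative assumptions, not the project's actual implementation:

```python
# Hypothetical router: decide whether a question should be answered with an
# AQL traversal (path and lookup queries) or a cuGraph algorithm
# (whole-network metrics). The keyword stems are illustrative only.
CUGRAPH_HINTS = ("communit", "centrality", "pagerank", "cluster")

def route_query(question: str) -> str:
    q = question.lower()
    return "cugraph" if any(hint in q for hint in CUGRAPH_HINTS) else "aql"

print(route_query("Run community detection on the supplier network"))  # cugraph
print(route_query("List parts below safety stock"))                    # aql
```

A production router would likely use the LLM itself to classify intent; a keyword heuristic just makes the two execution targets concrete.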
## Prerequisites

- Python 3.10+
- NVIDIA GPU with CUDA 12.0+ (for GPU acceleration)
- ArangoDB 3.11+
## Configuration

1. Configure the ArangoDB connection by updating the `config/arangodb.yaml` file:

   ```yaml
   development:
     host: 127.0.0.1
     port: 8529
     database: geograph
     username: root
     password: your_password
     graph_name: supplychain
   ```
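For reference, settings like those in `config/arangodb.yaml` can be loaded in Python and turned into a connection URL. The `load_arango_config` helper and the python-arango driver shown in the comment are assumptions for illustration, not part of the shipped codebase:

```python
def load_arango_config(path="config/arangodb.yaml", env="development"):
    """Read one environment block from the YAML config file."""
    import yaml  # PyYAML, imported lazily so arango_url below has no deps
    with open(path) as f:
        return yaml.safe_load(f)[env]

def arango_url(cfg):
    """Build the HTTP endpoint from the host/port settings."""
    return f"http://{cfg['host']}:{cfg['port']}"

# Values mirroring the sample config:
cfg = {"host": "127.0.0.1", "port": 8529, "database": "geograph",
       "username": "root", "password": "your_password",
       "graph_name": "supplychain"}
print(arango_url(cfg))  # http://127.0.0.1:8529

# Connecting with the python-arango driver would then look like:
# from arango import ArangoClient
# db = ArangoClient(hosts=arango_url(cfg)).db(
#     cfg["database"], username=cfg["username"], password=cfg["password"])
```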
2. Ensure the collections configuration in the same file matches your data model:

   ```yaml
   collections:
     vertices:
       - name: suppliers
       - name: parts
       - name: products
       # Other vertex collections...
     edges:
       - name: supplier_provides_part
       - name: part_depends_on
       # Other edge collections...
   ```
## Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/ramcovasu/geograph-guardian.git
   cd geograph-guardian
   ```
2. Run the main installation script:

   ```bash
   bash install.sh
   ```
   This script will:
   - Check and install CUDA if needed
   - Set up a Conda environment
   - Install RAPIDS and other dependencies
   - Install ArangoDB
-
Alternatively, you can install dependencies manually:
pip install -r requirements.txt
4. Run the additional setup script if needed:

   ```bash
   bash setup_additional.sh
   ```

   This handles starting ArangoDB and installing additional packages.
Note: The installation scripts (`install.sh`, `rapids.sh`, `setup_additional.sh`) can be moved into the project directory structure for better organization. Create an `install` directory within the project and move these files there.
## GPU Acceleration

For GPU acceleration with NVIDIA GPUs, run:

```bash
bash rapids.sh
```
## Weather Impact Analysis

To enable the weather impact analysis feature:

1. Get an API key from OpenWeatherMap.
2. Install the weather module:

   ```bash
   bash install_weather_module.sh
   ```

3. Add your API key to the `.env` file:

   ```
   OPENWEATHER_API_KEY=your_api_key_here
   ```
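Once the key is in place, the weather module can call OpenWeatherMap's current-weather endpoint. The sketch below only builds the request URL; the `weather_url` helper and the coordinates are illustrative, and the public v2.5 endpoint shown is an assumption about which API the module uses:

```python
import os
from urllib.parse import urlencode

def weather_url(lat, lon, api_key):
    """Build a current-weather request URL for one facility location."""
    params = urlencode({"lat": lat, "lon": lon,
                        "units": "metric", "appid": api_key})
    return f"https://api.openweathermap.org/data/2.5/weather?{params}"

# Reads the key the .env step above configures (falling back to a placeholder).
key = os.environ.get("OPENWEATHER_API_KEY", "your_api_key_here")
print(weather_url(35.68, 139.69, key))  # e.g. a Tokyo-area supplier site
```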
## Project Structure

After organizing the installation files, your project structure should look like this:

```
geograph-guardian/
├── app.py                   # Main Streamlit application
├── config/
│   └── arangodb.yaml        # Database configuration
├── data/
│   ├── cache/               # Cache for weather data
│   ├── processed/           # Processed data files
│   ├── raw/                 # Raw data files
│   └── reference/           # Reference data
├── install/                 # Installation scripts
│   ├── install.sh           # Main installation script
│   ├── rapids.sh            # RAPIDS installation
│   ├── Release.key          # ArangoDB release key
│   └── setup_additional.sh  # Additional setup steps
├── logs/                    # Application logs
├── requirements.txt         # Python dependencies
├── run_streamlit.sh         # Script to run the Streamlit app
└── src/                     # Source code
    ├── data_processing/     # Data processing modules
    ├── graph_analytics/     # Graph analytics algorithms
    ├── llm/                 # LLM integration
    ├── scripts/             # Utility scripts
    ├── streamlit/           # Streamlit UI components
    ├── utils/               # Utility functions
    ├── visualization/       # Visualization modules
    └── weather/             # Weather analysis module
```
## Data Setup

To initialize the database and load sample data:

```bash
# Process and ingest data
python src/scripts/process_data.py
python src/scripts/ingest_data.py

# Verify data ingestion
python src/scripts/validate_data.py
```
Alternatively, use the all-in-one script:

```bash
bash run.sh
```
## Running the Application

Start the application using the provided script:

```bash
bash run_streamlit.sh
```

Or manually with:

```bash
streamlit run app.py
```
The application will be available at http://localhost:8501 with three main interfaces:
- Chat Assistant: Ask natural language questions about your supply chain
- Graph Analytics: Perform advanced graph analytics such as community detection, centrality analysis, and shortest path analysis
- Weather Impact Analysis: Monitor real-time weather impacts on your supply chain
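To make the shortest-path analysis concrete, here is a pure-Python sketch over a toy supply-chain graph. The node names are made up; in the application itself these traversals run via AQL or cuGraph rather than in plain Python:

```python
from collections import deque

# Toy supply-chain network: suppliers feed parts, parts feed products.
edges = {
    "SupplierA": ["Part1"],
    "SupplierB": ["Part1", "Part2"],
    "Part1": ["ProductX"],
    "Part2": ["ProductX"],
    "ProductX": [],
}

def shortest_path(graph, src, dst):
    """Breadth-first search: fewest-hop path from src to dst, or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in graph.get(node, []):
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

# A disruption at SupplierA reaches ProductX through Part1:
print(shortest_path(edges, "SupplierA", "ProductX"))
# ['SupplierA', 'Part1', 'ProductX']
```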
## Example Queries

Here are some example questions you can ask the chat assistant:
- "Can you get 7 suppliers and their risk scores?"
- "Can you give me the name of the suppliers and their risk scores who can supply parts similar to the suppliers in JAPAN?"
- "Show me suppliers who have had delayed purchase orders and their current risk scores"
- "Show parts with HIGH criticality level where current inventory is below safety stock, include the supplier's risk score and lead time"
- "List all warehouse locations where product part LID001 has quantity on hand greater than safety stock?"
- "Show me all parts that had negative inventory transactions (issues/stockouts) along with their primary suppliers' risk scores and current inventory levels"
- "Show me all suppliers who provide parts with HIGH criticality that are primary suppliers and have risk scores above 0.75"
- "Can you give me names of alternate suppliers to POWERCELL along with their risk scores and the names of parts they can supply?"
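Behind the scenes, a question like the first one above can be answered by a single AQL query. The collection and field names below (`suppliers`, `risk_score`) are assumptions based on the data model described earlier, and the python-arango call in the comment is likewise illustrative:

```python
# Candidate AQL for "Can you get 7 suppliers and their risk scores?"
# Bind variables (@n) keep the limit parameterized rather than inlined.
AQL = """
FOR s IN suppliers
    SORT s.risk_score DESC
    LIMIT @n
    RETURN { name: s.name, risk_score: s.risk_score }
"""
bind_vars = {"n": 7}

# Executing it with the python-arango driver would look like:
# cursor = db.aql.execute(AQL, bind_vars=bind_vars)
# rows = list(cursor)
```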
## System Requirements

Minimum:
- CPU: 4 cores
- RAM: 8 GB
- Storage: 10 GB
- OS: Ubuntu 22.04, Windows 10+, or macOS 12+

For GPU acceleration:
- NVIDIA GPU with 8 GB+ VRAM
- CUDA Toolkit 12.0+
- 16 GB+ system RAM
## Acknowledgments

- OpenWeatherMap API for real-time weather data
- NVIDIA RAPIDS team for GPU-accelerated data science libraries
- ArangoDB team for the graph database system