# 🌞 Solar Impact Insights

A complete system for analyzing and predicting Solar Energetic Particle (SEP) events, built with Docker, PostgreSQL, and FastAPI.

## 🚀 Features

### 📊 Multiple Dashboards

- **Main Web Dashboard**: Modern interface with interactive charts
- **Streamlit Analytics**: Advanced analysis and data exploration
- **Grafana Monitoring**: Real-time monitoring with alerts

### 🔬 Advanced Analytics

- **Data Collection**: Integration with solar and atmospheric data APIs
- **Correlation Analysis**: Statistical analysis between different variables
- **Clustering**: Grouping of similar events with 3D visualizations
- **Anomaly Detection**: Automatic identification of anomalous events
- **Predictions**: Machine learning models for event prediction

### 🚨 Alert System

- **Real-time Alerts**: Notifications for high-intensity events
- **Severity Classification**: High, Medium, Low
- **Alert Dashboard**: Dedicated interface for alert management

### 🛠️ Interface Technologies

- **REST API**: Complete interface for system interaction
- **WebSockets**: Real-time updates (planned)
- **Database**: Persistent storage with PostgreSQL

## 🛠️ Technologies

- Python 3.11
- FastAPI - Modern web framework
- PostgreSQL - Relational database
- SQLAlchemy - Python ORM
- Docker & Docker Compose - Containerization
- Pandas & NumPy - Data analysis
- Scikit-learn - Machine learning
- Matplotlib & Seaborn - Visualization
- pgAdmin - Database administration interface

## 📋 Prerequisites

- Docker Desktop
- Docker Compose
- PowerShell (Windows)

πŸƒβ€β™‚οΈ How to Run

Option 1: PowerShell Script (Recommended)

# Start all services
.\scripts\start.ps1

# Stop all services
.\scripts\stop.ps1

Option 2: Manual Docker Compose

# Start services
docker-compose up --build -d

# Stop services
docker-compose down

# View logs
docker-compose logs -f app

Option 3: Local Streamlit

# Run only Streamlit locally
.\scripts\run_streamlit.ps1

Option 4: Demo Mode

# Run demo with sample data
.\scripts\demo.ps1

Option 5: Individual Services

# Database only
docker-compose up -d db

# FastAPI + Database
docker-compose up -d db app

# All except Grafana
docker-compose up -d db app streamlit pgadmin

## 🌐 Access Points

After starting the services, the following components are available:

### 🗄️ Database

**PostgreSQL Connection**

- Host: `localhost`
- Port: `5432`
- Database: `previsao_solar`
- User: `postgres`
- Password: `postgres123`

**Main Tables**

- `sep_events` - Solar energetic particle events
- `predictions` - Model-generated predictions
- `alerts` - System alerts
- `model_metrics` - Model performance metrics
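With the SQLAlchemy ORM listed above, these credentials are typically combined into a single connection URL. A minimal sketch, assuming the common `postgresql+psycopg2` driver string (the project's actual `.env` configuration may differ):

```python
from urllib.parse import quote_plus


def database_url(user: str = "postgres", password: str = "postgres123",
                 host: str = "localhost", port: int = 5432,
                 db: str = "previsao_solar") -> str:
    """Build a SQLAlchemy connection URL from the credentials above.

    The driver suffix (+psycopg2) is an assumption for illustration;
    quote_plus() keeps the URL valid if the password ever contains
    special characters.
    """
    return f"postgresql+psycopg2://{user}:{quote_plus(password)}@{host}:{port}/{db}"


print(database_url())
# postgresql+psycopg2://postgres:postgres123@localhost:5432/previsao_solar
```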

### 📊 API Endpoints

**Data Collection**

- `POST /data/collect` - Collect and store data
- `GET /data/events` - List stored events
- `GET /data/high-intensity` - List high-intensity events

**Analytics**

- `POST /analysis/correlations` - Correlation analysis
- `POST /analysis/clustering` - Event clustering
- `POST /analysis/prediction` - Generate predictions

**Monitoring**

- `GET /alerts` - List active alerts
- `GET /health` - Application status
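The endpoints above can be exercised with any HTTP client. A standard-library sketch, assuming the API is served on port 8000 as in the dashboard URL used elsewhere in this README:

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"  # port taken from the dashboard URL in this README


def call(method: str, path: str):
    """Send a request to the running API and decode its JSON response."""
    req = request.Request(BASE_URL + path, method=method)
    with request.urlopen(req) as resp:
        return json.loads(resp.read())


# These require the stack to be running (e.g. docker-compose up -d db app):
# call("GET", "/health")          # application status
# call("POST", "/data/collect")   # collect and store data
# call("GET", "/alerts")          # list active alerts
```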

πŸ—οΈ Project Structure

solar-impact-insights/
β”œβ”€β”€ app/                    # Main application
β”‚   β”œβ”€β”€ main.py            # Entry point
β”‚   └── api.py             # FastAPI routes
β”œβ”€β”€ domain/                # Domain entities
β”‚   └── entities.py        # Data classes
β”œβ”€β”€ use_cases/            # Use cases
β”‚   β”œβ”€β”€ analysis.py       # Statistical analysis
β”‚   └── alerts.py         # Alert system
β”œβ”€β”€ infrastructure/       # Infrastructure
β”‚   β”œβ”€β”€ data_collection.py # Data collection
β”‚   β”œβ”€β”€ visualization.py   # Visualizations
β”‚   └── database/         # Database configuration
β”‚       β”œβ”€β”€ models.py     # SQLAlchemy models
β”‚       └── repository.py # Repositories
β”œβ”€β”€ adapters/             # Adapters
β”‚   └── data_adapter.py   # Data integration
β”œβ”€β”€ data/                 # Data files
β”‚   └── real_solar_data.csv # Solar event dataset
β”œβ”€β”€ sql/                  # SQL scripts
β”‚   └── init.sql          # Database initialization
β”œβ”€β”€ scripts/              # Utility scripts
β”‚   β”œβ”€β”€ start.ps1         # Start services
β”‚   β”œβ”€β”€ stop.ps1          # Stop services
β”‚   β”œβ”€β”€ run_streamlit.ps1 # Run Streamlit locally
β”‚   β”œβ”€β”€ populate_database.py # Database population
β”‚   β”œβ”€β”€ collect_real_data.py # Real data collection
β”‚   └── demo.ps1          # Demo script
β”œβ”€β”€ static/               # Static web assets
β”‚   β”œβ”€β”€ dashboard.css     # Dashboard styles
β”‚   └── dashboard.js      # Dashboard JavaScript
β”œβ”€β”€ templates/            # HTML templates
β”‚   └── dashboard.html    # Main dashboard template
β”œβ”€β”€ grafana/              # Grafana configuration
β”‚   └── dashboard.json    # Grafana dashboard config
β”œβ”€β”€ logs/                 # Application logs
β”œβ”€β”€ streamlit_app.py      # Streamlit application
β”œβ”€β”€ demo_api.py          # API demonstration script
β”œβ”€β”€ docker-compose.yml    # Docker orchestration
β”œβ”€β”€ Dockerfile           # Application image
β”œβ”€β”€ requirements.txt     # Python dependencies
β”œβ”€β”€ .env                 # Environment variables
β”œβ”€β”€ .gitignore           # Git ignore rules
└── README.md            # Project documentation

## 🔧 Development

### Local Environment

```bash
# Install dependencies
pip install -r requirements.txt

# Configure environment variables
cp .env.example .env

# Run the database only
docker-compose up -d db

# Run the application locally
python app/main.py

# Run Streamlit locally
streamlit run streamlit_app.py
```

### Data Management

```bash
# Populate the database with sample data
python scripts/populate_database.py

# Collect real solar data
python scripts/collect_real_data.py

# Quick data collection
python scripts/quick_collect.py
```

### Debugging

```bash
# View application logs
docker-compose logs -f app

# View database logs
docker-compose logs -f db

# Open a shell in the application container
docker-compose exec app bash

# Open a psql session in the database
docker-compose exec db psql -U postgres -d previsao_solar
```

## 📈 Usage Example

1. Start the services:

   ```powershell
   .\scripts\start.ps1
   ```

2. Access the dashboard: http://localhost:8000/dashboard

3. Collect data: click "Collect Mock Data"

4. Run analyses: "Correlation Analysis", "Clustering", "Generate Predictions"

5. Monitor: "View Alerts" and "High-Intensity Events"

## 🚨 Alerts and Monitoring

The system raises automatic alerts for:

- High-intensity events (SEP intensity > 5.0)
- Detected data anomalies
- Critical event predictions
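The High/Medium/Low classification can be sketched as a simple threshold rule. The 5.0 high-intensity cutoff comes from this README; the medium/low split at 2.0 is a hypothetical value for illustration only:

```python
def classify_severity(intensity: float) -> str:
    """Classify a SEP event by its intensity value.

    The 5.0 high-intensity cutoff is stated in this README; the 2.0
    medium/low boundary is an assumed, illustrative threshold.
    """
    if intensity > 5.0:   # README's high-intensity alert threshold
        return "High"
    if intensity > 2.0:   # hypothetical medium threshold
        return "Medium"
    return "Low"


print(classify_severity(6.2))  # High
print(classify_severity(3.0))  # Medium
print(classify_severity(1.0))  # Low
```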

## 📝 Logs

Logs are stored in:

- Container: `/app/logs/`
- Local: `./logs/`

## 🤝 Contributing

1. Fork the project
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Open a Pull Request

## 📄 License

This project is licensed under the MIT License.

## 🆘 Support

For issues or questions:

1. Check the logs: `docker-compose logs -f`
2. Restart the services: `.\scripts\stop.ps1` followed by `.\scripts\start.ps1`
3. Verify that all required ports are available

## 🔄 Updates

To update the system:

```powershell
# Stop services
.\scripts\stop.ps1

# Rebuild without cache
docker-compose build --no-cache

# Start again
.\scripts\start.ps1
```
