Complete system for analysis and prediction of Solar Energetic Particle (SEP) events using Docker, PostgreSQL, and FastAPI.
- Main Web Dashboard: Modern interface with interactive charts
- Streamlit Analytics: Advanced analysis and data exploration
- Grafana Monitoring: Real-time monitoring with alerts
- Data Collection: Integration with solar and atmospheric data APIs
- Correlation Analysis: Statistical analysis between different variables
- Clustering: Grouping of similar events with 3D visualizations
- Anomaly Detection: Automatic identification of anomalous events
- Predictions: Machine learning models for event prediction
- Real-time Alerts: Notifications for high-intensity events
- Severity Classification: High, Medium, Low
- Alert Dashboard: Dedicated interface for management
- REST API: Complete interface for system interaction
- WebSockets: Real-time updates (planned)
- Database: Persistent storage with PostgreSQL
- Python 3.11
- FastAPI - Modern web framework
- PostgreSQL - Relational database
- SQLAlchemy - Python ORM
- Docker & Docker Compose - Containerization
- Pandas & NumPy - Data analysis
- Scikit-learn - Machine Learning
- Matplotlib & Seaborn - Visualization
- PgAdmin - Database administration interface
- Docker Desktop
- Docker Compose
- PowerShell (Windows)
```powershell
# Start all services
.\scripts\start.ps1

# Stop all services
.\scripts\stop.ps1
```

```bash
# Start services
docker-compose up --build -d

# Stop services
docker-compose down

# View logs
docker-compose logs -f app
```

```powershell
# Run only Streamlit locally
.\scripts\run_streamlit.ps1

# Run demo with sample data
.\scripts\demo.ps1
```

```bash
# Database only
docker-compose up -d db

# FastAPI + Database
docker-compose up -d db app

# All except Grafana
docker-compose up -d db app streamlit pgadbmin
```
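After `docker-compose up`, the FastAPI service can take a few seconds to become reachable. A minimal readiness check is sketched below, polling the `/health` endpoint listed later in this README (the helper name and timings are illustrative, not part of the project):

```python
import time
import urllib.request


def wait_for_api(url: str = "http://localhost:8000/health",
                 timeout: float = 60.0) -> bool:
    """Poll the API's /health endpoint until it responds.

    Returns True once a response arrives, False if `timeout`
    seconds elapse without one.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2.0):
                return True
        except OSError:  # connection refused / not yet listening
            time.sleep(1.0)
    return False
```

This is handy in CI or demo scripts, where issuing requests before the container finishes booting would otherwise fail intermittently.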
After starting the services:
- Main Dashboard: http://localhost:8000/dashboard
- Streamlit Analytics: http://localhost:8501
- Grafana Monitoring: http://localhost:3000
  - User: `admin`
  - Password: `admin123`
- API Documentation: http://localhost:8000/docs
- API ReDoc: http://localhost:8000/redoc
- PgAdmin: http://localhost:5050
  - Email: `admin@previsao.com`
  - Password: `admin123`
- Host: localhost
- Port: 5432
- Database: previsao_solar
- User: postgres
- Password: postgres123
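These parameters can be assembled into a standard PostgreSQL connection URL for SQLAlchemy or psycopg. A small sketch, with the helper name being illustrative rather than part of the codebase:

```python
def database_url(host: str = "localhost",
                 port: int = 5432,
                 db: str = "previsao_solar",
                 user: str = "postgres",
                 password: str = "postgres123") -> str:
    """Build a PostgreSQL connection URL from the defaults above.

    The result can be passed to sqlalchemy.create_engine(...).
    """
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"


# database_url() -> "postgresql://postgres:postgres123@localhost:5432/previsao_solar"
```

In practice these values should come from `.env` rather than being hard-coded, matching how the Docker services are configured.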
- `sep_events` - Solar energetic particle events
- `predictions` - Model-generated predictions
- `alerts` - System alerts
- `model_metrics` - Model performance metrics
- `POST /data/collect` - Collect and store data
- `GET /data/events` - List stored events
- `GET /data/high-intensity` - High-intensity events
- `POST /analysis/correlations` - Correlation analysis
- `POST /analysis/clustering` - Event clustering
- `POST /analysis/prediction` - Generate predictions
- `GET /alerts` - List active alerts
- `GET /health` - Application status
```
solar-impact-insights/
├── app/                       # Main application
│   ├── main.py                # Entry point
│   └── api.py                 # FastAPI routes
├── domain/                    # Domain entities
│   └── entities.py            # Data classes
├── use_cases/                 # Use cases
│   ├── analysis.py            # Statistical analysis
│   └── alerts.py              # Alert system
├── infrastructure/            # Infrastructure
│   ├── data_collection.py     # Data collection
│   ├── visualization.py       # Visualizations
│   └── database/              # Database configuration
│       ├── models.py          # SQLAlchemy models
│       └── repository.py      # Repositories
├── adapters/                  # Adapters
│   └── data_adapter.py        # Data integration
├── data/                      # Data files
│   └── real_solar_data.csv    # Solar event dataset
├── sql/                       # SQL scripts
│   └── init.sql               # Database initialization
├── scripts/                   # Utility scripts
│   ├── start.ps1              # Start services
│   ├── stop.ps1               # Stop services
│   ├── run_streamlit.ps1      # Run Streamlit locally
│   ├── populate_database.py   # Database population
│   ├── collect_real_data.py   # Real data collection
│   └── demo.ps1               # Demo script
├── static/                    # Static web assets
│   ├── dashboard.css          # Dashboard styles
│   └── dashboard.js           # Dashboard JavaScript
├── templates/                 # HTML templates
│   └── dashboard.html         # Main dashboard template
├── grafana/                   # Grafana configuration
│   └── dashboard.json         # Grafana dashboard config
├── logs/                      # Application logs
├── streamlit_app.py           # Streamlit application
├── demo_api.py                # API demonstration script
├── docker-compose.yml         # Docker orchestration
├── Dockerfile                 # Application image
├── requirements.txt           # Python dependencies
├── .env                       # Environment variables
├── .gitignore                 # Git ignore rules
└── README.md                  # Project documentation
```
```bash
# Install dependencies
pip install -r requirements.txt

# Configure environment variables
cp .env.example .env

# Run database only
docker-compose up -d db

# Run local application
python app/main.py

# Run Streamlit locally
streamlit run streamlit_app.py
```

```bash
# Populate database with sample data
python scripts/populate_database.py

# Collect real solar data
python scripts/collect_real_data.py

# Quick data collection
python scripts/quick_collect.py
```

```bash
# View application logs
docker-compose logs -f app

# View database logs
docker-compose logs -f db

# Access application container
docker-compose exec app bash

# Access database
docker-compose exec db psql -U postgres -d previsao_solar
```
1. Start the services: `.\scripts\start.ps1`
2. Access the dashboard: http://localhost:8000/dashboard
3. Collect data: click "Collect Mock Data"
4. Run analyses: "Correlation Analysis", "Clustering", "Generate Predictions"
5. Monitor: "View Alerts" and "High-Intensity Events"
The system raises automatic alerts for:
- High-intensity events (SEP intensity > 5.0)
- Detected data anomalies
- Critical event predictions
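The severity labels mentioned earlier (High, Medium, Low) can be derived from the event intensity. A sketch of such a classifier follows; only the high-intensity cutoff of 5.0 comes from this README, while the medium threshold is an assumed placeholder:

```python
def classify_severity(sep_intensity: float) -> str:
    """Map a SEP event intensity to a severity label.

    The 5.0 cutoff for "High" matches the alert threshold above;
    the 2.0 cutoff for "Medium" is illustrative only.
    """
    if sep_intensity > 5.0:
        return "High"
    if sep_intensity > 2.0:  # assumed medium threshold
        return "Medium"
    return "Low"
```

Keeping the thresholds in one function makes it easy to tune them later against the stored `model_metrics` without touching the alerting code.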
Logs are stored in:
- Container: `/app/logs/`
- Local: `./logs/`
- Fork the project
- Create a feature branch
- Commit your changes
- Push to the branch
- Open a Pull Request
This project is licensed under the MIT License.
For issues or questions:
- Check the logs: `docker-compose logs -f`
- Restart the services: `.\scripts\stop.ps1` followed by `.\scripts\start.ps1`
- Verify that all required ports (8000, 8501, 3000, 5050, 5432) are available
To update the system:

```powershell
# Stop services
.\scripts\stop.ps1

# Rebuild
docker-compose build --no-cache

# Start
.\scripts\start.ps1
```