A comprehensive Python library for algorithmic trading, technical analysis, and financial data processing. Built for performance, reliability, and ease of use in both research and production environments.
The overall goal is to get as much value as possible without paying for API access ;) This means managing API query rates, storing as much data as possible locally, and leveraging multiple APIs to get the data we need.
- Complete Documentation - Comprehensive guides and API reference
- Quick Start Guide - Get up and running in minutes
- Installation & Setup - Installation instructions and configuration
- Data APIs Guide - Multi-source financial data fetching
- Technical Indicators - Complete guide to 50+ indicators with charts and accuracy data
- Trading Metrics - Comprehensive metrics for risk, performance, and analysis
- Cache System - Local data storage and optimization
- Release Process - How to release new versions
- 7+ data providers: Yahoo Finance, Finnhub, Alpha Vantage, FMP, Twelve Data, Polygon, Tiingo
- Automatic rate limiting and error handling
- Smart failover between data sources (sketched below)
- Local caching for performance and cost reduction
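
A rough sketch of how failover across sources can be layered on top of the fetchers (only `fetch_yahoo` is shown elsewhere in this README; the wrapper below is illustrative, not the library's built-in failover):

```python
from open_trading_algo.fin_data_apis.fetchers import fetch_yahoo

def fetch_with_failover(tickers, fields, sources=(fetch_yahoo,)):
    """Try each data source in order and return the first successful result."""
    last_exc = None
    for fetch in sources:
        try:
            return fetch(tickers, fields)
        except Exception as exc:  # rate limit hit, outage, malformed response, ...
            last_exc = exc
    raise RuntimeError("all data sources failed") from last_exc

quotes = fetch_with_failover(["AAPL"], ["price", "volume"])
```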
- 50+ technical indicators: RSI, MACD, Bollinger Bands, ADX, Stochastic, Williams %R, etc.
- Custom indicators: Fibonacci retracements, volume profiles, market breadth
- Multi-timeframe analysis support
- Signal aggregation and optimization
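
At its simplest, signal aggregation can AND-combine boolean signal series; a minimal sketch using `rsi_oversold_signal` (imported in the Quick Start below) together with a hand-rolled 50/200 moving-average trend filter:

```python
import pandas as pd
from open_trading_algo.indicators.long_signals import rsi_oversold_signal

def combined_long_signal(close: pd.Series) -> pd.Series:
    """True only where the RSI oversold signal and an SMA uptrend agree."""
    rsi_sig = rsi_oversold_signal(close).astype(bool)
    trend_up = close.rolling(50).mean() > close.rolling(200).mean()
    return rsi_sig & trend_up
```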
- Long/short equity signals with multiple strategies
- Options trading signals for calls and puts
- Sentiment-based signals from social media and analyst ratings
- Machine learning ensemble methods
- Modular trading models combining indicators and strategies
- Dynamic position sizing based on volatility (see the sketch after this list)
- Automated stop-loss and take-profit levels
- Portfolio-level risk controls
- Correlation-based hedging strategies
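
To illustrate the idea behind volatility-based sizing, here is a generic volatility-targeting sketch in plain pandas (not the library's `risk_management` API, whose interface isn't shown here):

```python
import pandas as pd

def position_size(close: pd.Series, capital: float,
                  target_vol: float = 0.15, window: int = 20) -> float:
    """Size a position so its annualized volatility roughly matches target_vol."""
    daily_vol = close.pct_change().rolling(window).std().iloc[-1]
    annual_vol = daily_vol * 252 ** 0.5
    if pd.isna(annual_vol) or annual_vol == 0:
        return 0.0
    weight = min(1.0, target_vol / annual_vol)  # never exceed full allocation
    return capital * weight
```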
- Real-time data feeds with configurable intervals (see the polling sketch after this list)
- Event-driven processing for low-latency signals
- Production logging and monitoring
- Thread-safe operations for concurrent processing
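
A minimal polling loop in the spirit of these features (a sketch built on `fetch_yahoo` from the Quick Start; the actual `live/` streaming interface isn't shown here):

```python
import time
from open_trading_algo.fin_data_apis.fetchers import fetch_yahoo

def stream_quotes(tickers, interval_sec=60):
    """Poll quotes at a fixed interval and hand each batch to a handler."""
    while True:
        quotes = fetch_yahoo(tickers, ["price", "volume"])
        for ticker, fields in quotes.items():
            print(ticker, fields)  # replace with your event handler
        time.sleep(interval_sec)
```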
```bash
git clone https://github.com/thephiltacular/open_trading_algo.git
cd open_trading_algo

# Complete automated setup
make setup_all

# Or use the setup script
./setup.sh all
```
```bash
git clone https://github.com/thephiltacular/open_trading_algo.git
cd open_trading_algo

# Set up the virtual environment and Poetry
make setup_env

# Set up configuration files
make setup_config

# Set up databases and caches
make setup_db
make setup_cache
make setup_influxdb

# Install dependencies
make install_depends
```
| Command | Description |
|---|---|
| `make setup` | Basic setup (env, config, db, cache) |
| `make setup_dev` | Complete development setup |
| `make setup_env` | Python virtual environment + Poetry |
| `make setup_config` | Configuration files from templates |
| `make setup_db` | SQLite database initialization |
| `make setup_cache` | Cache directories and files |
| `make setup_influxdb` | InfluxDB container setup (requires Docker) |
| `make setup_all` | Everything including tests |
| `make dev_env` | Activate development environment |
| `make check_env` | Verify environment configuration |
| `make status` | Show repository and config status |
For users who prefer scripts over Make:
```bash
# Complete setup
./setup.sh all

# Individual components
./setup.sh venv    # Virtual environment
./setup.sh config  # Configuration files
./setup.sh deps    # Dependencies
```
After setup, configure your API keys:
- Edit `secrets.env` and add your API keys:

  ```bash
  FINNHUB_API_KEY=your_finnhub_key
  FMP_API_KEY=your_fmp_key
  # ... add other API keys
  ```

- Review the configuration files in the `config/` directory:
  - `db_config.yaml` - Database settings
  - `api_config.yaml` - API rate limits and settings
  - `live_data_config.yaml` - Live data streaming config
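
These are plain YAML files, so you can inspect them directly; a quick sanity-check sketch using PyYAML (how the library itself loads them isn't shown here, and the `sqlite` key is taken from the sample `db_config.yaml` later in this README):

```python
import yaml

# Load the database configuration and peek at the SQLite section
with open("config/db_config.yaml") as fh:
    cfg = yaml.safe_load(fh)

print(cfg.get("sqlite", {}))  # e.g. {'db_path': '...', 'enable_caching': True}
```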
The setup automatically configures three cache types:
- SQLite Cache (`data/cache/sqlite/`): Default lightweight cache for development
- Parquet Cache (`data/cache/parquet/`): High-performance columnar storage for analytics
- InfluxDB Cache (`data/cache/influxdb/`): Time-series database for high-frequency data
```python
from open_trading_algo.fin_data_apis.fetchers import fetch_yahoo
from open_trading_algo.indicators.indicators import calculate_rsi
from open_trading_algo.indicators.long_signals import rsi_oversold_signal

# Fetch current market data
data = fetch_yahoo(["AAPL", "GOOGL"], ["price", "volume"])
print(f"AAPL: ${data['AAPL']['price']:.2f}")

# Technical analysis with historical data
import yfinance as yf

df = yf.Ticker("AAPL").history(period="6mo")

# Calculate RSI and generate signals
rsi = calculate_rsi(df["Close"])
signals = rsi_oversold_signal(df["Close"])
print(f"Current RSI: {rsi.iloc[-1]:.2f}")
print(f"Active signals: {signals.sum()}")
```
Calculate comprehensive risk and performance metrics:
```python
from open_trading_algo.indicators.metrics import (
    compute_sharpe_ratio, compute_max_drawdown,
    compute_volatility_ratio, compute_vwap,
)

# Risk and performance analysis
# (returns_df holds periodic returns, price_df holds OHLCV prices,
#  both prepared beforehand)
sharpe = compute_sharpe_ratio(returns_df)
max_dd = compute_max_drawdown(price_df)
vol_ratio = compute_volatility_ratio(price_df)
vwap = compute_vwap(price_df)

print(f"Sharpe Ratio: {sharpe.iloc[-1]:.3f}")
print(f"Max Drawdown: {max_dd.iloc[-1]:.3f}")
print(f"Volatility Ratio: {vol_ratio.iloc[-1]:.3f}")
print(f"VWAP: ${vwap.iloc[-1]:.2f}")
```
```
open_trading_algo/
├── fin_data_apis/               # Multi-source data integration
│   ├── fetchers.py              # Unified data fetching interface
│   ├── rate_limit.py            # Automatic rate limiting
│   └── [7 API modules]          # Individual data source integrations
├── indicators/                  # Technical analysis library
│   ├── indicators.py            # 50+ technical indicators
│   ├── metrics.py               # 24 trading metrics (Sharpe, drawdown, etc.)
│   ├── long_signals.py          # Long position signals
│   ├── short_signals.py         # Short position signals
│   └── options_signals.py       # Options trading signals
├── models/                      # Trading strategy models
│   ├── base_model.py            # Abstract base class for all models
│   ├── momentum_model.py        # Momentum-based strategies
│   ├── mean_reversion_model.py  # Mean reversion strategies
│   └── trend_following_model.py # Trend following strategies
├── cache/                       # Multiple caching implementations
│   ├── data_cache.py            # SQLite-based cache (default)
│   ├── parquet_cache.py         # Parquet columnar storage
│   ├── timeseries_cache.py      # InfluxDB time series database
│   ├── setup_influxdb.py        # InfluxDB setup utilities
│   └── README_TimeSeries.md     # Time series cache documentation
├── sentiment/                   # Sentiment analysis integration
├── risk_management.py           # Position sizing and risk controls
└── signal_optimizer.py          # Multi-signal optimization
```
- `fin_data_apis/`: Multi-source financial data fetching with rate limiting
- `indicators/`: 50+ technical indicators and 24 trading metrics
- `models/`: Trading strategy models and machine learning algorithms
- `cache/`: Multiple caching implementations (SQLite, Parquet, InfluxDB) for different performance and storage needs
- `backtest/`: Historical strategy testing and Monte Carlo simulation
- `sentiment/`: Social media and analyst sentiment analysis
- `alerts/`: Real-time signal notifications and alerts
- `live/`: Real-time data streaming and event processing
open_trading_algo provides three different caching implementations optimized for different use cases:
- Best for: Getting started quickly, development, small to medium datasets
- Storage: SQLite database with automatic table creation
- Features: OHLCV data, signals storage, thread-safe operations
- Setup: No additional dependencies required
- Performance: Fast for most use cases, excellent for repeated queries
```python
from open_trading_algo.cache.data_cache import DataCache

cache = DataCache()  # Uses the default SQLite database
cache.store_price_data('AAPL', ohlcv_df)
cached_data = cache.get_price_data('AAPL')
```
- Best for: Analytical workloads, large datasets, research environments
- Storage: Apache Parquet files with partitioning by ticker
- Features: High compression, fast analytical queries, pandas integration
- Setup: Requires the `pyarrow` package
- Performance: Superior for complex queries and aggregations
```python
from open_trading_algo.cache.parquet_cache import ParquetCache

cache = ParquetCache()  # Uses Parquet files
cache.store_price_data('AAPL', ohlcv_df)
cached_data = cache.get_price_data('AAPL')
```
- Best for: Production systems, high-frequency data, real-time analytics
- Storage: InfluxDB time series database with automatic compression
- Features: Automated technical indicator calculation, advanced queries, retention policies
- Setup: Requires Docker and InfluxDB container
- Performance: Optimized for time series queries, handles millions of data points
```python
from open_trading_algo.cache.timeseries_cache import TimeSeriesCache

cache = TimeSeriesCache()  # Uses InfluxDB
cache.store_price_data('AAPL', ohlcv_df)

# Automatically calculate and store technical indicators
cache.calculate_and_store_metrics('AAPL', indicators=['sma_20', 'rsi_14', 'macd'])
metrics = cache.get_metrics('AAPL')
```
| Feature | SQLite Cache | Parquet Cache | InfluxDB Cache |
|---|---|---|---|
| Setup Complexity | 🟢 None | 🟡 Low | 🔴 Medium |
| Performance | 🟢 Good | 🟡 Very Good | 🟢 Excellent |
| Storage Efficiency | 🟡 Good | 🟢 Excellent | 🟢 Excellent |
| Query Flexibility | 🟢 Good | 🟡 Very Good | 🟢 Excellent |
| Time Series Features | 🔴 Basic | 🟡 Good | 🟢 Excellent |
| Technical Indicators | 🔴 Manual | 🔴 Manual | 🟢 Automatic |
| Best Use Case | Development/Quick Start | Research/Analytics | Production/Real-time |
All cache types support signal caching to avoid recomputing expensive calculations:
- Signals are only computed once per unique (ticker, timeframe, signal_type) combination
- All signal modules are integrated with the cache system
- On repeated runs, signals are loaded instantly from the database
```python
# 1. Fetch and cache data
from open_trading_algo.cache.data_cache import DataCache

cache = DataCache()
cache.store_price_data("AAPL", df)

# 2. Generate signals
from open_trading_algo.indicators.long_signals import compute_and_cache_long_signals

signals_df = compute_and_cache_long_signals("AAPL", df, "1d")

# 3. Cache signals for reuse
cache.store_signals("AAPL", "1d", "long_trend", signals_df)

# 4. Retrieve cached signals (kept separate from the price DataFrame)
cached_signals = cache.get_signals("AAPL", "1d", "long_trend")
print(cached_signals)
```
All cache types can be configured via `config/db_config.yaml`:
```yaml
# SQLite Cache Configuration
sqlite:
  db_path: "/path/to/custom/database.db"
  enable_caching: true

# Parquet Cache Configuration
parquet:
  cache_dir: "/path/to/parquet/cache"
  compression: "snappy"

# InfluxDB Cache Configuration
influxdb:
  url: "http://localhost:8086"
  token: "your-token"
  org: "trading-org"
  bucket: "trading-data"
```
You can easily switch between cache types without changing your application code:
```python
# Switch from SQLite to InfluxDB
from open_trading_algo.cache.timeseries_cache import TimeSeriesCache

# Your existing code works unchanged
cache = TimeSeriesCache()
cache.store_price_data('AAPL', ohlcv_df)
data = cache.get_price_data('AAPL')
```
See the Cache System Documentation for complete setup and usage instructions.
This project uses GitHub Actions for automated testing and publishing:
- Multi-platform testing: Ubuntu, macOS, Windows
- Multi-Python support: Python 3.9, 3.10, 3.11, 3.12
- Automated testing: pytest with coverage reporting
- Code quality: Black, isort, flake8, mypy
- Security scanning: detect-secrets
- Tag-triggered releases: Push a version tag (`v1.2.3`) to automatically publish to PyPI
- Release creation: GitHub releases automatically trigger PyPI publishing
- Quality assurance: All tests must pass before publishing
- Manual publishing: Use `python3 publish.py` for manual control
1. Update the version in `pyproject.toml`
2. Run the tests locally: `poetry run pytest`
3. Create a GitHub release or push a version tag
4. GitHub Actions handles the rest automatically
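
For example, to cut a release from the command line (the version number is illustrative):

```bash
git tag v1.2.3
git push origin v1.2.3
```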
See the Release Process Documentation for detailed instructions.
We welcome contributions from the community! Please read our Contributing Guide for instructions on how to get started, code style, testing, and submitting pull requests.
This project is licensed under the MIT License. See the LICENSE file for details.