open_trading_algo

A comprehensive Python library for algorithmic trading, technical analysis, and financial data processing. Built for performance, reliability, and ease of use in both research and production environments.

The overall goal is to get as much value as possible without paying for API access ;) This means managing API query rates, storing as much data as possible locally, and leveraging multiple APIs to get the data we need.

Key Features

🎯 Multi-Source Data Integration

  • 7+ data providers: Yahoo Finance, Finnhub, Alpha Vantage, FMP, Twelve Data, Polygon, Tiingo
  • Automatic rate limiting and error handling
  • Smart failover between data sources
  • Local caching for performance and cost reduction
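
The failover idea can be sketched generically. This is a minimal illustration only; the function name, signature, and provider callables here are hypothetical, not the library's API:

```python
import time

def fetch_with_failover(tickers, fields, providers, retries=2, backoff=0.1):
    """Try each provider callable in order; retry with backoff, then fail over."""
    last_error = None
    for fetch in providers:
        for attempt in range(retries):
            try:
                data = fetch(tickers, fields)
                if data:  # treat an empty result as a miss and keep trying
                    return data
            except Exception as exc:
                last_error = exc
                time.sleep(backoff * (attempt + 1))  # linear backoff before retrying
    raise RuntimeError(f"all providers failed: {last_error}")
```

Each provider is just a callable taking the same (tickers, fields) shape, so reordering the list changes the source preference.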

πŸ“ˆ Advanced Technical Analysis

  • 50+ technical indicators: RSI, MACD, Bollinger Bands, ADX, Stochastic, Williams %R, etc.
  • Custom indicators: Fibonacci retracements, volume profiles, market breadth
  • Multi-timeframe analysis support
  • Signal aggregation and optimization
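
As an illustration of what aggregation means here (a generic sketch, not the library's signal_optimizer API), boolean signal series can be combined with a fraction-of-agreement vote:

```python
import pandas as pd

def vote_signals(signals: dict, threshold: float = 0.5) -> pd.Series:
    """True where at least `threshold` of the aligned signal series agree."""
    votes = pd.concat(signals, axis=1).fillna(False)
    return votes.mean(axis=1) >= threshold
```

With threshold=1.0 this degrades to unanimous agreement; lower thresholds trade precision for more frequent entries.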

🎯 Intelligent Signal Generation

  • Long/short equity signals with multiple strategies
  • Options trading signals for calls and puts
  • Sentiment-based signals from social media and analyst ratings
  • Machine learning ensemble methods
  • Modular trading models combining indicators and strategies

βš–οΈ Risk Management

  • Dynamic position sizing based on volatility
  • Automated stop-loss and take-profit levels
  • Portfolio-level risk controls
  • Correlation-based hedging strategies
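
The position-sizing idea, in its standard fixed-fractional form (a textbook sketch, not the library's risk_management implementation):

```python
def fixed_fractional_size(equity: float, risk_fraction: float,
                          entry: float, stop: float) -> float:
    """Shares to hold so that a stop-out loses at most `risk_fraction` of equity."""
    risk_per_share = abs(entry - stop)
    if risk_per_share == 0:
        raise ValueError("entry and stop must differ")
    return (equity * risk_fraction) / risk_per_share
```

Because the stop distance is in the denominator, a volatility-scaled stop (e.g. a multiple of ATR) automatically shrinks the position when the market gets noisier.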

πŸ”„ Live Trading Ready (work in progress; a future goal)

  • Real-time data feeds with configurable intervals
  • Event-driven processing for low-latency signals
  • Production logging and monitoring
  • Thread-safe operations for concurrent processing

Quick Start

Installation & Setup

Option 1: Automated Setup (Recommended)

git clone https://github.com/thephiltacular/open_trading_algo.git
cd open_trading_algo

# Complete automated setup
make setup_all

# Or use the setup script
./setup.sh all

Option 2: Manual Setup

git clone https://github.com/thephiltacular/open_trading_algo.git
cd open_trading_algo

# Setup virtual environment and Poetry
make setup_env

# Setup configuration files
make setup_config

# Setup databases and caches
make setup_db
make setup_cache
make setup_influxdb

# Install dependencies
make install_depends

Available Setup Commands

Command              Description
make setup           Basic setup (env, config, db, cache)
make setup_dev       Complete development setup
make setup_env       Python virtual environment + Poetry
make setup_config    Configuration files from templates
make setup_db        SQLite database initialization
make setup_cache     Cache directories and files
make setup_influxdb  InfluxDB container setup (requires Docker)
make setup_all       Everything including tests
make dev_env         Activate development environment
make check_env       Verify environment configuration
make status          Show repository and config status

Setup Script Alternative

For users who prefer scripts over Make:

# Complete setup
./setup.sh all

# Individual components
./setup.sh venv      # Virtual environment
./setup.sh config    # Configuration files
./setup.sh deps      # Dependencies

Configuration

After setup, configure your API keys:

  1. Edit secrets.env and add your API keys:

    FINNHUB_API_KEY=your_finnhub_key
    FMP_API_KEY=your_fmp_key
    # ... add other API keys
  2. Review configuration files in config/ directory:

    • db_config.yaml - Database settings
    • api_config.yaml - API rate limits and settings
    • live_data_config.yaml - Live data streaming config

Cache Systems Setup

The setup automatically configures three cache types:

  • SQLite Cache (data/cache/sqlite/): Default lightweight cache for development
  • Parquet Cache (data/cache/parquet/): High-performance columnar storage for analytics
  • InfluxDB Cache (data/cache/influxdb/): Time-series database for high-frequency data

Basic Usage

from open_trading_algo.fin_data_apis.fetchers import fetch_yahoo
from open_trading_algo.indicators.indicators import calculate_rsi
from open_trading_algo.indicators.long_signals import rsi_oversold_signal

# Fetch current market data
data = fetch_yahoo(["AAPL", "GOOGL"], ["price", "volume"])
print(f"AAPL: ${data['AAPL']['price']:.2f}")

# Technical analysis with historical data
import yfinance as yf

df = yf.Ticker("AAPL").history(period="6mo")

# Calculate RSI and generate signals
rsi = calculate_rsi(df["Close"])
signals = rsi_oversold_signal(df["Close"])

print(f"Current RSI: {rsi.iloc[-1]:.2f}")
print(f"Active signals: {signals.sum()}")
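
For reference, here is what a standard Wilder-smoothed RSI looks like in plain pandas (a generic textbook implementation; the library's calculate_rsi may differ in smoothing details):

```python
import pandas as pd

def rsi_wilder(close: pd.Series, period: int = 14) -> pd.Series:
    """RSI using Wilder's exponential smoothing (alpha = 1/period)."""
    delta = close.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / period, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / period, adjust=False).mean()
    rs = gain / loss
    return 100 - 100 / (1 + rs)
```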

πŸ“Š Trading Metrics

Calculate comprehensive risk and performance metrics:

from open_trading_algo.indicators.metrics import (
    compute_sharpe_ratio, compute_max_drawdown,
    compute_volatility_ratio, compute_vwap
)

# Risk and performance analysis
sharpe = compute_sharpe_ratio(returns_df)
max_dd = compute_max_drawdown(price_df)
vol_ratio = compute_volatility_ratio(price_df)
vwap = compute_vwap(price_df)

print(f"Sharpe Ratio: {sharpe.iloc[-1]:.3f}")
print(f"Max Drawdown: {max_dd.iloc[-1]:.3f}")
print(f"Volatility Ratio: {vol_ratio.iloc[-1]:.3f}")
print(f"VWAP: ${vwap.iloc[-1]:.2f}")
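
As a point of reference, the usual max-drawdown definition in plain pandas. The library's compute_max_drawdown returns a Series per the snippet above; this generic version condenses it to the single worst peak-to-trough number:

```python
import pandas as pd

def max_drawdown_pct(prices: pd.Series) -> float:
    """Worst peak-to-trough decline, as a negative fraction of the peak."""
    running_peak = prices.cummax()
    drawdown = prices / running_peak - 1.0
    return float(drawdown.min())
```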

πŸ—οΈ Architecture

Project Structure

open_trading_algo/
β”œβ”€β”€ πŸ“Š fin_data_apis/     # Multi-source data integration
β”‚   β”œβ”€β”€ fetchers.py       # Unified data fetching interface
β”‚   β”œβ”€β”€ rate_limit.py     # Automatic rate limiting
β”‚   └── [7 API modules]   # Individual data source integrations
β”œβ”€β”€ πŸ“ˆ indicators/        # Technical analysis library
β”‚   β”œβ”€β”€ indicators.py     # 50+ technical indicators
β”‚   β”œβ”€β”€ metrics.py        # 24 trading metrics (Sharpe, drawdown, etc.)
β”‚   β”œβ”€β”€ long_signals.py   # Long position signals
β”‚   β”œβ”€β”€ short_signals.py  # Short position signals
β”‚   └── options_signals.py # Options trading signals
β”œβ”€β”€ πŸ€– models/            # Trading strategy models
β”‚   β”œβ”€β”€ base_model.py     # Abstract base class for all models
β”‚   β”œβ”€β”€ momentum_model.py # Momentum-based strategies
β”‚   β”œβ”€β”€ mean_reversion_model.py # Mean reversion strategies
β”‚   └── trend_following_model.py # Trend following strategies
β”œβ”€β”€ πŸ’Ύ cache/            # Multiple caching implementations
β”‚   β”œβ”€β”€ data_cache.py     # SQLite-based cache (default)
β”‚   β”œβ”€β”€ parquet_cache.py  # Parquet columnar storage
β”‚   β”œβ”€β”€ timeseries_cache.py # InfluxDB time series database
β”‚   β”œβ”€β”€ setup_influxdb.py # InfluxDB setup utilities
β”‚   └── README_TimeSeries.md # Time series cache documentation
β”œβ”€β”€ 🎯 sentiment/        # Sentiment analysis integration
β”œβ”€β”€ βš–οΈ risk_management.py # Position sizing and risk controls
└── πŸ”„ signal_optimizer.py # Multi-signal optimization

Core Modules

  • fin_data_apis/: Multi-source financial data fetching with rate limiting
  • indicators/: 50+ technical indicators and 24 trading metrics
  • models/: Trading strategy models and machine learning algorithms
  • cache/: Multiple caching implementations (SQLite, Parquet, InfluxDB) for different performance and storage needs
  • backtest/: Historical strategy testing and Monte Carlo simulation
  • sentiment/: Social media and analyst sentiment analysis
  • alerts/: Real-time signal notifications and alerts
  • live/: Real-time data streaming and event processing

Cache System

open_trading_algo provides three different caching implementations optimized for different use cases:

1. SQLite DataCache (Default - Zero Configuration)

  • Best for: Getting started quickly, development, small to medium datasets
  • Storage: SQLite database with automatic table creation
  • Features: OHLCV data, signals storage, thread-safe operations
  • Setup: No additional dependencies required
  • Performance: Fast for most use cases, excellent for repeated queries

from open_trading_algo.cache.data_cache import DataCache

cache = DataCache()  # Uses default SQLite database
cache.store_price_data('AAPL', ohlcv_df)
cached_data = cache.get_price_data('AAPL')

2. Parquet Cache (Columnar Storage)

  • Best for: Analytical workloads, large datasets, research environments
  • Storage: Apache Parquet files with partitioning by ticker
  • Features: High compression, fast analytical queries, pandas integration
  • Setup: Requires pyarrow package
  • Performance: Superior for complex queries and aggregations

from open_trading_algo.cache.parquet_cache import ParquetCache

cache = ParquetCache()  # Uses Parquet files
cache.store_price_data('AAPL', ohlcv_df)
cached_data = cache.get_price_data('AAPL')

3. InfluxDB Time Series Cache (High Performance)

  • Best for: Production systems, high-frequency data, real-time analytics
  • Storage: InfluxDB time series database with automatic compression
  • Features: Automated technical indicator calculation, advanced queries, retention policies
  • Setup: Requires Docker and InfluxDB container
  • Performance: Optimized for time series queries, handles millions of data points

from open_trading_algo.cache.timeseries_cache import TimeSeriesCache

cache = TimeSeriesCache()  # Uses InfluxDB
cache.store_price_data('AAPL', ohlcv_df)

# Automatically calculate and store technical indicators
cache.calculate_and_store_metrics('AAPL', indicators=['sma_20', 'rsi_14', 'macd'])
metrics = cache.get_metrics('AAPL')

Choosing the Right Cache

Feature               SQLite Cache             Parquet Cache        InfluxDB Cache
Setup Complexity      🟒 None                  🟑 Low               πŸ”΄ Medium
Performance           🟒 Good                  🟑 Very Good         🟒 Excellent
Storage Efficiency    🟑 Good                  🟒 Excellent         🟒 Excellent
Query Flexibility     🟒 Good                  🟑 Very Good         🟒 Excellent
Time Series Features  πŸ”΄ Basic                 🟑 Good              🟒 Excellent
Technical Indicators  πŸ”΄ Manual                πŸ”΄ Manual            🟒 Automatic
Best Use Case         Development/Quick Start  Research/Analytics   Production/Real-time

Signal Caching: Avoid Recomputing Signals

All cache types support signal caching to avoid recomputing expensive calculations:

  • Signals are only computed once per unique (ticker, timeframe, signal_type) combination
  • All signal modules are integrated with the cache system
  • On repeated runs, signals are loaded instantly from the database

Signal Generation Pipeline

# 1. Fetch and cache data
from open_trading_algo.cache.data_cache import DataCache
cache = DataCache()
cache.store_price_data("AAPL", df)

# 2. Generate signals
from open_trading_algo.indicators.long_signals import compute_and_cache_long_signals
signals_df = compute_and_cache_long_signals("AAPL", df, "1d")

# 3. Cache signals for reuse
cache.store_signals("AAPL", "1d", "long_trend", signals_df)

# 4. Retrieve cached signals
df = cache.get_signals("AAPL", "1d", "long_trend")
print(df)

Cache Configuration

All cache types support configuration via config/db_config.yaml:

# SQLite Cache Configuration
sqlite:
  db_path: "/path/to/custom/database.db"
  enable_caching: true

# Parquet Cache Configuration
parquet:
  cache_dir: "/path/to/parquet/cache"
  compression: "snappy"

# InfluxDB Cache Configuration
influxdb:
  url: "http://localhost:8086"
  token: "your-token"
  org: "trading-org"
  bucket: "trading-data"

Cache Migration

You can easily switch between cache types without changing your application code:

# Switch from SQLite to InfluxDB
from open_trading_algo.cache.timeseries_cache import TimeSeriesCache

# Your existing code works unchanged
cache = TimeSeriesCache()
cache.store_price_data('AAPL', ohlcv_df)
data = cache.get_price_data('AAPL')

See the Cache System Documentation for complete setup and usage instructions.

πŸ”„ CI/CD & Releases

This project uses GitHub Actions for automated testing and publishing:

πŸ§ͺ Continuous Integration

  • Multi-platform testing: Ubuntu, macOS, Windows
  • Multi-Python support: Python 3.9, 3.10, 3.11, 3.12
  • Automated testing: pytest with coverage reporting
  • Code quality: Black, isort, flake8, mypy
  • Security scanning: detect-secrets

πŸš€ Automated Publishing

  • Tag-triggered releases: Push a version tag (v1.2.3) to automatically publish to PyPI
  • Release creation: GitHub releases automatically trigger PyPI publishing
  • Quality assurance: All tests must pass before publishing
  • Manual publishing: Use python3 publish.py for manual control

πŸ“¦ Release Process

  1. Update version in pyproject.toml
  2. Run tests locally: poetry run pytest
  3. Create GitHub release or push version tag
  4. GitHub Actions handles the rest automatically
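
Steps 3 and 4 come down to a plain git tag push. Demonstrated here in a scratch repository; in the real repo you would run the tag commands at the release commit:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp" && git init -q demo && cd demo
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "release 1.2.3"
git tag v1.2.3            # tag must match the version set in pyproject.toml
git tag --list 'v*'       # prints v1.2.3
# git push origin v1.2.3  # pushing the tag triggers the PyPI publish workflow
```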

See the Release Process Documentation for detailed instructions.

🀝 Contributing

We welcome contributions from the community! Please read our Contributing Guide for instructions on how to get started, code style, testing, and submitting pull requests.

πŸ“ License

This project is licensed under the MIT License. See the LICENSE file for details.
