A fast and flexible API response caching library for Python that helps you improve application performance by caching expensive operations and API responses. PocketCache provides a simple yet powerful interface with support for multiple storage backends, serialization formats, and both synchronous and asynchronous operations.
- Simple Interface: Easy to use with an intuitive API that feels natural in Python
- Flexible Storage: Choose from memory, Redis, or filesystem storage, or implement your own backend
- Efficient Serialization: Built-in support for JSON and Pickle serialization with extensible serializer interface
- Production Ready: Thoroughly tested, type-safe, and used in production environments
- Performance Focused: Optimized for high-throughput scenarios with minimal overhead
- Developer Friendly: Comprehensive documentation, type hints, and extensive examples
- API Response Caching: Cache external API responses to reduce latency and API costs
- Database Query Results: Store frequently accessed database query results
- Computation Results: Cache results of expensive computations
- Session Data: Store user session data with automatic expiration
- Rate Limiting: Implement rate limiting using cache counters
- Distributed Caching: Share cache across multiple application instances using Redis
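The rate-limiting use case above works with any cache that exposes `get`/`set` with a TTL. As a rough sketch (using a minimal in-memory stand-in rather than PocketCache itself, so the pattern is visible end to end):

```python
import time


class MinimalCache:
    """Tiny stand-in for a get/set cache with TTL (illustration only)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        item = self._store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            del self._store[key]
            return None
        return value

    def set(self, key, value, ttl):
        self._store[key] = (value, time.monotonic() + ttl)


def allow_request(cache, client_id, limit=100, window=60):
    """Rate limiting via a cache counter. Note: re-setting the key on
    each request resets its TTL, so this approximates a window that
    slides with activity rather than a strict fixed window."""
    key = f"rate:{client_id}"
    count = cache.get(key) or 0
    if count >= limit:
        return False
    cache.set(key, count + 1, ttl=window)
    return True
```

The counter key expires on its own, so stale clients never need explicit cleanup.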
PocketCache is designed with performance in mind:
- Minimal Overhead: Less than 1ms overhead per cache operation in memory backend
- Thread-Safe: All operations are thread-safe by default
- Memory Efficient: Smart memory management with automatic cleanup of expired items
- Configurable TTL: Fine-grained control over cache item expiration
- Failure Resilient: Graceful handling of backend failures with optional fallbacks
- Monitoring Ready: Built-in support for cache statistics and monitoring
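The failure-resilience point deserves a concrete shape. One common pattern (sketched here with hypothetical stand-in classes, not PocketCache's actual fallback API) is to wrap a primary backend and fall back to a local store when it errors:

```python
class FailingBackend:
    """Stand-in for a backend whose connection is down (illustration)."""

    def get(self, key):
        raise ConnectionError("backend unavailable")


class DictBackend:
    """Stand-in in-process fallback store."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value


class FallbackCache:
    """Sketch of the fallback pattern: try the primary backend,
    fall back to the local store when it raises."""

    def __init__(self, primary, fallback):
        self.primary = primary
        self.fallback = fallback

    def get(self, key):
        try:
            return self.primary.get(key)
        except Exception:
            return self.fallback.get(key)
```

A production version would also distinguish transient from permanent errors and record the failure for monitoring.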
Memory Backend (operations/second):
- Get: ~500,000
- Set: ~300,000
- Delete: ~400,000
Redis Backend (operations/second):
- Get: ~50,000
- Set: ~40,000
- Delete: ~45,000
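Figures like these depend heavily on hardware, payload size, and network round-trips, so it is worth measuring on your own setup. A minimal methodology sketch with the standard library (using a plain dict as the operation under test; substitute your own cache calls):

```python
import timeit


def ops_per_second(fn, number=100_000):
    """Rough operations/second for a single cache operation."""
    elapsed = timeit.timeit(fn, number=number)
    return number / elapsed


store = {}
set_rate = ops_per_second(lambda: store.__setitem__("k", "v"))
get_rate = ops_per_second(lambda: store.get("k"))
```

For Redis-backed numbers, run the benchmark against a local instance to separate library overhead from network latency.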
PocketCache is used in production by various applications:
- High-traffic web applications serving millions of requests
- Data processing pipelines caching intermediate results
- Microservices architectures sharing cached data
- API gateways implementing response caching
- Machine learning applications caching model predictions
Here's how PocketCache compares to other popular caching solutions:
- More flexible with multiple serialization options
- Not tied to Django framework
- Better support for async operations
- More extensive type hints
- Similar familiar API
- Support for distributed caching (Redis)
- Better serialization options
- More extensive documentation
- Similar memory efficiency
- More complex but more feature-rich
- Higher-level abstraction
- Multiple backend support
- Built-in serialization
- Simpler API
- More Pythonic interface
- More modern API
- Better Python integration
- Multiple backend support
- Built-in type safety
- More extensive feature set
- Multiple backend support (Memory, Redis, File System)
- Flexible serialization (JSON, Pickle)
- Decorator-based caching
- TTL (Time-To-Live) support
- Async support
- Type hints
- Extensive test coverage
- Comprehensive documentation
```bash
pip install pocket-cache
```
```python
from pocket_cache import Cache
from pocket_cache.utils.decorators import cached

# Create a cache instance
cache = Cache()

# Basic usage
cache.set("my_key", "my_value", ttl=300)  # Cache for 5 minutes
value = cache.get("my_key")

# Decorator usage
@cached(ttl=300)
def expensive_operation(x):
    # This result will be cached for 5 minutes
    return x * 2

# Using with Redis backend
from pocket_cache.backends.redis import RedisCache
from redis import Redis

redis_client = Redis(host="localhost", port=6379)
cache = Cache(backend=RedisCache(redis_client))
```
PocketCache can be configured with different backends and serializers:
```python
from datetime import timedelta

from pocket_cache import Cache
from pocket_cache.backends.redis import RedisCache
from pocket_cache.serializers.pickle import PickleSerializer
from redis import Redis

redis_client = Redis(host="localhost", port=6379)
cache = Cache(
    backend=RedisCache(redis_client),
    serializer=PickleSerializer(),
    default_ttl=timedelta(minutes=5),
)
```
```python
from datetime import timedelta
from typing import Optional

from pocket_cache.backends.base import BaseCacheBackend


class MyCustomBackend(BaseCacheBackend):
    def get(self, key: str) -> Optional[bytes]:
        # Implementation
        pass

    def set(self, key: str, value: bytes, ttl: timedelta) -> None:
        # Implementation
        pass

    def delete(self, key: str) -> None:
        # Implementation
        pass

    def clear(self) -> None:
        # Implementation
        pass
```
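To show what filling in that skeleton might look like, here is a minimal dict-backed implementation of the same four methods. It is illustrative only; PocketCache's bundled memory backend may differ (e.g. in locking and eviction):

```python
from datetime import datetime, timedelta
from typing import Dict, Optional, Tuple


class InMemoryBackend:
    """Minimal backend: stores (value, expiry) pairs in a dict and
    expires entries lazily on read."""

    def __init__(self) -> None:
        self._store: Dict[str, Tuple[bytes, datetime]] = {}

    def get(self, key: str) -> Optional[bytes]:
        item = self._store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if datetime.now() >= expires_at:
            del self._store[key]  # lazy cleanup of expired entry
            return None
        return value

    def set(self, key: str, value: bytes, ttl: timedelta) -> None:
        self._store[key] = (value, datetime.now() + ttl)

    def delete(self, key: str) -> None:
        self._store.pop(key, None)

    def clear(self) -> None:
        self._store.clear()
```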
```python
from typing import Any

import msgpack

from pocket_cache.serializers.base import BaseSerializer


class MsgPackSerializer(BaseSerializer):
    def serialize(self, value: Any) -> bytes:
        return msgpack.packb(value)

    def deserialize(self, value: bytes) -> Any:
        return msgpack.unpackb(value)
```
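The only contract a serializer must honor is that `deserialize(serialize(value))` round-trips. For comparison, a stdlib-only variant of the same interface (illustrative; PocketCache ships its own JSON serializer, which may differ in detail):

```python
import json
from typing import Any


class JSONBytesSerializer:
    """Encodes values to UTF-8 JSON bytes and back (round-trip contract)."""

    def serialize(self, value: Any) -> bytes:
        return json.dumps(value).encode("utf-8")

    def deserialize(self, value: bytes) -> Any:
        return json.loads(value.decode("utf-8"))
```

JSON keeps cached payloads human-readable and language-neutral, at the cost of supporting fewer Python types than Pickle or msgpack.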
- Clone the repository:

```bash
git clone https://github.com/yourusername/pocket-cache.git
cd pocket-cache
```

- Create and activate a virtual environment:

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

- Install development dependencies:

```bash
pip install -e ".[dev,test,docs]"
```

- Install pre-commit hooks:

```bash
pre-commit install
```
```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=pocket_cache

# Run specific test file
pytest tests/test_cache.py
```
The project uses several tools to maintain code quality:

- `black` for code formatting
- `isort` for import sorting
- `flake8` for style guide enforcement
- `mypy` for static type checking
Run them using:

```bash
# Format code
black src tests examples
isort src tests examples

# Check types
mypy src tests examples

# Check style
flake8 src tests examples
```
We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Run the tests (`pytest`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.