A containerized Python application that automatically downloads videos from an RPI camera web interface, concatenates daily footage, and uploads to YouTube. Designed to run in Docker with automatic scheduling.
- Automatic Scheduling: Built-in scheduler runs every 15 minutes (configurable)
- Video Scraping: Downloads new videos from the RPI camera web interface
- Server Cleanup: Automatically deletes videos from the server after download
- Daily Processing: Concatenates all videos from a day at a configurable time
- YouTube Upload: Automatically uploads daily compilations to YouTube
- Fully Configurable: All settings via environment variables
- Docker Native: Designed to run in containers
- Retry Logic: Robust error handling with configurable retry attempts
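The retry behavior can be sketched as a simple exponential-backoff wrapper. This is an illustration only; the function name and backoff parameters here are assumptions, not the project's actual implementation:

```python
import time

def with_retries(operation, max_retries=5, base_delay=1.0):
    """Run operation(), retrying up to max_retries times with exponential backoff."""
    for attempt in range(1, max_retries + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_retries:
                raise  # Give up after the configured number of attempts
            time.sleep(base_delay * 2 ** (attempt - 1))
```

The same pattern covers both download requests and upload calls: transient network errors are retried, and the last exception propagates once the configured attempt count is exhausted.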
- Clone and set up:

  ```bash
  git clone https://github.com/MauroDruwel/RPI-Cam-Web-Interface-Scraper.git
  cd RPI-Cam-Web-Interface-Scraper
  ```

- Configure environment variables in `docker-compose.yml`:

  ```yaml
  # Edit the environment section in docker-compose.yml
  environment:
    - RPICAM_BASE_URL=https://your-camera-server.com/path/
    # Adjust scheduling if needed
    - RPICAM_SCRAPE_INTERVAL_MINUTES=15
    - RPICAM_DAILY_PROCESS_TIME=23:59
  ```

- Set up YouTube API credentials:

  ```bash
  mkdir secrets
  # Copy your client_secrets.json to the secrets/ directory
  ```

- Run with Docker:

  ```bash
  docker-compose up -d
  ```
That's it! The container will automatically:
- Scrape videos every 15 minutes
- Process and upload daily videos at 23:59
- Handle retries and errors gracefully
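The daily trigger amounts to computing the next occurrence of the configured HH:MM time and sleeping until then. A stdlib-only illustration of that calculation (the project may use a scheduling library instead):

```python
from datetime import datetime, timedelta

def next_daily_run(now, process_time="23:59"):
    """Return the next datetime at which daily processing should fire."""
    hour, minute = map(int, process_time.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # Today's slot already passed; schedule tomorrow
    return candidate
```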
- `RPICAM_BASE_URL`: Base URL of your RPI camera web interface
- `RPICAM_ENABLE_SCHEDULER`: Enable automatic scheduling (default: true)
- `RPICAM_SCRAPE_INTERVAL_MINUTES`: Minutes between scraping runs (default: 15)
- `RPICAM_DAILY_PROCESS_TIME`: Time for daily processing in HH:MM format (default: 23:59)
- `RPICAM_DATA_DIR`: Directory to store videos (default: /data/videos)
- `RPICAM_MAX_RETRIES`: Maximum retry attempts (default: 5)
- `RPICAM_REQUEST_TIMEOUT`: Request timeout in seconds (default: 30)
- `RPICAM_DOWNLOAD_TIMEOUT`: Download timeout in seconds (default: 60)
- `YOUTUBE_CLIENT_SECRETS`: Path to YouTube client secrets file (default: client_secrets.json)
- `YOUTUBE_TOKEN_PATH`: Path to save YouTube auth token (default: token.pickle)
- `YOUTUBE_UPLOAD_TITLE_PREFIX`: Prefix for video titles (default: RPiCam)
- `YOUTUBE_UPLOAD_DESCRIPTION`: Description for uploaded videos
- `YOUTUBE_UPLOAD_TAGS`: Comma-separated tags for videos
- `YOUTUBE_UPLOAD_CATEGORY`: YouTube category ID (default: 22)
- `YOUTUBE_PRIVACY_STATUS`: Privacy status (default: unlisted)
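Inside the container these variables are read into a config object with the documented defaults. A minimal sketch of that pattern (field names are assumptions, not the project's actual `config.py`):

```python
import os
from dataclasses import dataclass

@dataclass
class Config:
    base_url: str
    scrape_interval_minutes: int
    daily_process_time: str

    def validate(self):
        # Ensure the daily processing time is in valid HH:MM format
        hours, minutes = self.daily_process_time.split(":")
        if not (0 <= int(hours) < 24 and 0 <= int(minutes) < 60):
            raise ValueError(f"Invalid RPICAM_DAILY_PROCESS_TIME: {self.daily_process_time}")

def load_config():
    """Build a Config from environment variables, applying the documented defaults."""
    return Config(
        base_url=os.environ.get("RPICAM_BASE_URL", ""),
        scrape_interval_minutes=int(os.environ.get("RPICAM_SCRAPE_INTERVAL_MINUTES", "15")),
        daily_process_time=os.environ.get("RPICAM_DAILY_PROCESS_TIME", "23:59"),
    )
```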
```bash
# Start with automatic scheduling
docker-compose up -d

# Check logs
docker-compose logs -f rpicam-scraper

# One-time scrape
docker-compose run rpicam-scraper python src/main.py --mode scrape

# One-time daily processing
docker-compose run rpicam-scraper python src/main.py --mode daily

# Process a specific date
docker-compose run rpicam-scraper python src/main.py --mode daily --date 2025-08-11

# Disable the scheduler and run a scrape only
docker-compose run -e RPICAM_ENABLE_SCHEDULER=false rpicam-scraper python src/main.py --mode scrape

# Stop the service
docker-compose down

# View logs
docker-compose logs rpicam-scraper

# Restart the service
docker-compose restart rpicam-scraper

# Update and rebuild
git pull
docker-compose build
docker-compose up -d
```
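The `--mode` and `--date` flags map to a small command-line interface. A hypothetical sketch of how `src/main.py` might parse them (the actual entry point may differ):

```python
import argparse

def parse_args(argv=None):
    """Parse the scraper's command-line flags."""
    parser = argparse.ArgumentParser(description="RPI camera scraper")
    parser.add_argument("--mode", choices=["scrape", "daily"], default="scrape",
                        help="scrape: download new videos; daily: concatenate and upload")
    parser.add_argument("--date", default=None,
                        help="process a specific day (YYYY-MM-DD); defaults to today")
    return parser.parse_args(argv)
```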
- Go to the Google Cloud Console
- Create a new project or select an existing one
- Enable the YouTube Data API v3
- Create OAuth 2.0 Client ID credentials for a desktop application
- Download the JSON file and save it as `secrets/client_secrets.json`
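Once credentials are in place, uploads go through the YouTube Data API v3 `videos.insert` endpoint, whose request body is built from the environment variables above. A sketch of that body (illustrative only, not the project's actual uploader code):

```python
def build_upload_body(title_prefix="RPiCam", date="2025-08-11",
                      description="", tags="", category_id="22",
                      privacy_status="unlisted"):
    """Build the videos.insert request body from the configured settings."""
    return {
        "snippet": {
            "title": f"{title_prefix} {date}",
            "description": description,
            # YOUTUBE_UPLOAD_TAGS is comma-separated; the API wants a list
            "tags": [t.strip() for t in tags.split(",") if t.strip()],
            "categoryId": category_id,  # 22 = "People & Blogs"
        },
        "status": {"privacyStatus": privacy_status},
    }
```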
```
├── src/
│   ├── main.py                  # Main entry point
│   └── rpicam_scraper/          # Main package
│       ├── config.py            # Configuration management
│       ├── scheduler.py         # Automatic scheduling
│       ├── video_scraper.py     # Video scraping and downloading
│       ├── video_processor.py   # Video concatenation and processing
│       └── youtube_uploader.py  # YouTube authentication and upload
├── tests/                       # Test suite (basic Docker tests)
├── .github/workflows/           # GitHub Actions for CI/CD
├── Dockerfile                   # Container configuration
├── docker-compose.yml           # Service definition
└── requirements.txt             # Python dependencies
```
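Daily concatenation of many clips is typically done with ffmpeg's concat demuxer, which reads a text file listing the inputs in order. A sketch of building that list (file layout and function name are assumptions, not the project's actual `video_processor.py`):

```python
from pathlib import Path

def write_concat_list(video_dir, list_path):
    """Write an ffmpeg concat-demuxer list of all mp4 files in video_dir, sorted by name."""
    videos = sorted(Path(video_dir).glob("*.mp4"))
    lines = [f"file '{v.as_posix()}'" for v in videos]
    Path(list_path).write_text("\n".join(lines) + "\n")
    return videos

# Then concatenate without re-encoding:
#   ffmpeg -f concat -safe 0 -i <list_path> -c copy daily.mp4
```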
- Docker and Docker Compose
- YouTube API credentials
- Internet connection
- RPI camera web interface
The scheduler provides detailed logging:
```bash
# Follow logs in real time
docker-compose logs -f rpicam-scraper

# Check the last 50 lines
docker-compose logs --tail=50 rpicam-scraper
```
Log messages include:
- Scheduler start/stop events
- Scraping operations with timestamps
- Daily processing results
- Error messages with retry attempts
- YouTube upload progress
- No videos found: Check that `RPICAM_BASE_URL` is correct and accessible
- YouTube upload fails: Verify `client_secrets.json` is in the `secrets/` directory
- Permission errors: Ensure the `./data` directory is writable
- FFmpeg errors: The container includes ffmpeg, but check video file integrity
```bash
# Test configuration
docker-compose run rpicam-scraper python -c "
from rpicam_scraper.config import config
config.validate()
print('Config OK')
"

# Test the connection to the camera
docker-compose run rpicam-scraper python -c "
import requests
from rpicam_scraper.config import config
r = requests.get(config.preview_url, timeout=10)
print(f'Camera response: {r.status_code}')
"

# Run in debug mode
docker-compose run rpicam-scraper python src/main.py --mode scrape
```
- Fork the repository
- Make your changes
- Test with `docker build -t test .`
- Submit a pull request
See the LICENSE file for details.