Production-ready Python service templates with Docker, testing, and best practices built-in

🐍 Python Microservice Templates

🚀 Overview

Python Microservice Templates is a production-ready template for building asynchronous REST microservices using FastAPI. It comes pre-configured with PostgreSQL, SQLAlchemy, Alembic for database migrations, background task scheduling via APScheduler, and Docker support. It's ideal for quickly bootstrapping microservice-based APIs and background workers.

✨ Features

  • ✅ FastAPI - a modern, high-performance web framework for building APIs.
  • 📦 PostgreSQL with the asynchronous psycopg driver and SQLAlchemy 2.0 ORM.
  • 🧠 Dependency injection powered by aioinject.
  • ⚙️ Alembic - built-in database migration support.
  • ⏱️ APScheduler - schedule background tasks using cron/date/interval triggers.
  • 🐳 Docker-ready - supports both dev and production container builds.
  • 🧪 Unified quality checks: type checking (mypy), linting (ruff), and security scanning (bandit).
  • 🎯 Based on Python 3.12 with a strict typing configuration.
  • 🧩 Pre-structured for adding REST endpoints (e.g., conversations) and background jobs.
  • 📒 Loguru - structured, context-aware logging with automatic correlation for HTTP requests and background jobs.

πŸ› οΈ Tech Stack

Layer             Technology
Language          Python 3.12
Web Framework     FastAPI
Database          PostgreSQL
ORM               SQLAlchemy 2.0
Migrations        Alembic
Scheduler         APScheduler
DI Container      aioinject
Logging           Loguru
Async Runtime     Uvicorn
Dev Tools         Rye, Hatch, Ruff, Mypy, Bandit
Packaging         Hatchling
Containerization  Docker

🏁 Getting Started

🧰 Prerequisites

Make sure you have the following tools installed on your system:

  • Python 3.12+

  • Docker

  • Rye (Python project management tool)

    • To install Rye:

      curl -sSf https://rye-up.com/get | bash

ℹ️ Optionally, Poetry or Hatch can be used, but this project is configured for Rye.


📦 Installation

Clone the repository and install dependencies:

git clone https://github.com/allcupsnotinprivate/python-microservice-templates.git
cd python-microservice-templates

# Sync the environment using Rye
rye sync

Create and configure your environment variables file (e.g., .dev.env). You can start with a copy of the provided example:

cp .env.example .dev.env

▶️ Running the App

With Rye

# Run the FastAPI app with Uvicorn
rye run asgi

With Docker

Make sure Docker is installed and running, then:

# Build and start the containers
docker-compose -f deploy/docker-compose.yml -p pmt up -d

The app should now be accessible at http://localhost:8000


🔧 Environment Configuration

The environment is configured using environment variables with the prefix PMT__. The configuration is loaded via pydantic-settings and supports nested settings using the double underscore __ delimiter.

Main configuration variables:

Variable                              Description                                       Example value
PMT__INTERNAL__LOG__ENABLE            Logger enable flag                                true (default) or false
PMT__INTERNAL__LOG__LEVEL             Minimum log level to output (see Logging Levels)  20 (default)
PMT__EXTERNAL__POSTGRES__HOST         Database host                                     localhost, or database (Docker)
PMT__EXTERNAL__POSTGRES__PORT         Database port                                     5432
PMT__EXTERNAL__POSTGRES__DATABASE     Database name                                     python_microservice_templates
PMT__EXTERNAL__POSTGRES__USER         Database user                                     postgres
PMT__EXTERNAL__POSTGRES__PASSWORD     Database password                                 postgres
PMT__EXTERNAL__POSTGRES__AUTOMIGRATE  Whether to run automatic migrations (boolean)     true (default) or false

You can use a .env file with the following content (check .env.example):

PMT__EXTERNAL__POSTGRES__HOST=database
PMT__EXTERNAL__POSTGRES__PORT=5432
PMT__EXTERNAL__POSTGRES__DATABASE=python_microservice_templates
PMT__EXTERNAL__POSTGRES__USER=postgres
PMT__EXTERNAL__POSTGRES__PASSWORD=postgres
PMT__EXTERNAL__POSTGRES__AUTOMIGRATE=true

These environment variables are loaded automatically when running the app or migration CLI.


📒 Logging

This project utilizes Loguru for robust, structured, and context-aware logging. It's pre-configured to enhance observability for both HTTP requests and background tasks.


Key Logging Features:

  1. Structured Logging: log output can be configured as JSON, making it easy to parse in log management systems.
  2. Contextual Information: logs are automatically enriched with relevant context, such as request IDs or job identifiers.
  3. Ease of Use: loguru provides a simple and powerful API for logging.

HTTP Request Logging

For incoming HTTP requests, logging is enhanced with a correlation ID to group all log entries related to a single request:

  1. X-Request-ID Header: if an incoming request includes an X-Request-ID header, its value is used as the request_id in the log context. This is useful for tracing requests across multiple services.
  2. Auto-generated ID: if no X-Request-ID header is present, a unique request_id is automatically generated (e.g., http-e9a869c4301).

This request_id is available in the log record's extra dictionary and can be included in your log format.


Background Task Logging

For background tasks executed by APScheduler, logs are enriched with specific job context:

  1. Context ID: each execution of a job gets a unique, auto-generated context_id (e.g., job-3dd5e13a136e). This helps distinguish log entries from different runs of the same job.
  2. Job ID (job_id): the unique identifier assigned to the job when it was scheduled (e.g., sync_external_data_job).
  3. Job Label (job_label): a human-readable label or name for the job, if provided during scheduling, to make logs more understandable.

These identifiers are also available in the log record's extra dictionary.


πŸ—„οΈ Database & Migrations

βš™οΈ Initializing the Database

The database service is configured in docker-compose.yml with PostgreSQL running and ready to use with the default credentials:

  • User: postgres
  • Password: postgres
  • Database: python_microservice_templates

The database data is persisted in the Docker volume database_data.


🧩 Running Migrations

This project uses Alembic for database migrations, wrapped by a typer CLI located at:

src/app/infrastructure/database/migrations/cli.py

You can run migration commands directly using:

python src/app/infrastructure/database/migrations/cli.py <command> [options]

For convenience, the migration CLI is registered as a Rye script, so you can also invoke it via Rye's task runner:

rye run migrations <command> [options]

Database URL Configuration

By default, the CLI reads the database connection URL from the environment variable:

MIGRATIONS_URL_DATABASE

The URL should be in the standard SQLAlchemy format, e.g.:

postgresql://postgres:postgres@localhost:5432/python_microservice_templates
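
If you prefer building this URL from parts rather than string formatting, SQLAlchemy's URL.create does it safely (the credentials below are the compose defaults from above):

```python
from sqlalchemy.engine import URL

# Assemble the connection URL from its components; special characters
# in the password are escaped automatically
url = URL.create(
    "postgresql",
    username="postgres",
    password="postgres",
    host="localhost",
    port=5432,
    database="python_microservice_templates",
)
print(url.render_as_string(hide_password=False))
# postgresql://postgres:postgres@localhost:5432/python_microservice_templates
```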

If needed, you can override this URL per command by providing the --database-url option:

rye run migrations <command> --database-url postgresql://postgres:postgres@localhost:5432/python_microservice_templates
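
The env-var-with-override pattern used by the migration CLI can be sketched with typer like this (a simplified stand-in, not the template's actual cli.py):

```python
import os

import typer

app = typer.Typer()


@app.command()
def current(
    database_url: str = typer.Option("", "--database-url", help="Overrides MIGRATIONS_URL_DATABASE"),
) -> None:
    # Fall back to the environment variable when the flag is not given
    url = database_url or os.environ.get("MIGRATIONS_URL_DATABASE", "")
    typer.echo(f"Using database URL: {url}")


if __name__ == "__main__":
    app()
```

The flag, when present, always wins over the environment variable, which is what lets you target a different database for a single command.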

Available Commands

Create a Migration Revision (with Autogeneration)

Creates a new migration revision file. By default, Alembic will attempt to autogenerate the migration by comparing models to the database schema.

rye run migrations revision -m "Add new users table"

You can disable autogeneration by passing --autogenerate false if you want to write migrations manually.


Apply Migrations (Upgrade)

Applies migrations to bring your database schema up to the specified revision (defaults to the latest, head).

rye run migrations upgrade

Interactive confirmation:
Before applying migrations, the CLI will prompt for confirmation:
Are you sure you want to perform the upgrade? [y/N]: ...

To skip this prompt (e.g., in CI pipelines or automated scripts), use the -y or --yes flag:

rye run migrations upgrade -y

Roll Back Migrations (Downgrade)

Rolls back migrations down to the specified revision (defaults to one step down, -1).

rye run migrations downgrade

Interactive confirmation:
Before rolling back migrations, the CLI will prompt:
Are you sure you want to perform the upgrade? [y/N]: ...
(Yes, the prompt text says "upgrade", but here it confirms the downgrade.)

To skip the confirmation prompt, use the -y or --yes flag:

rye run migrations downgrade -y

Show Current Revision

Displays the current migration revision applied in the database:

rye run migrations current

Show Migration History

Lists the full migration history with applied revisions and their order:

rye run migrations history

If you want to automate migrations in your deployment or CI/CD pipeline, it's recommended to use the -y flag to avoid manual confirmation and ensure smooth, non-interactive execution.


πŸ—‚οΈ Project Structure

This project follows a clean and modular architecture to keep code organized and maintainable. Below is an overview of the main folders and files:

./
├── LICENSE                   # Project license
├── README.md                 # Project overview and docs
├── deploy/                   # Deployment configs
│   ├── Dockerfile            # Docker image build file
│   └── docker-compose.yml    # Docker compose for local/dev
├── pyproject.toml            # Python project config and dependencies
├── requirements.lock         # Locked production dependencies
├── requirements-dev.lock     # Locked dev dependencies
└── src/
    └── app/                  # Main application code
        ├── main.py           # App entry point
        ├── asgi.py           # ASGI app instance for async serving
        ├── api/              # API layer (REST endpoints, handlers, schemas)
        │   ├── rest/
        │   │   └── v1/       # API versioning, e.g. conversations endpoints
        │   └── tasks/        # Scheduled tasks
        ├── configs/          # Application configuration modules
        ├── container/        # Dependency injection container & wrappers
        ├── exceptions/       # Custom exceptions & error handling
        ├── infrastructure/   # DB access, migrations, scheduling, etc.
        │   ├── database/
        │   │   └── migrations/ # Alembic migrations and CLI
        │   └── scheduler/    # Scheduler setup and configuration
        ├── logs/             # Logging configuration and utilities
        │   ├── logger.py     # Loguru setup and main logger configuration
        │   └── types.py      # Type definitions related to logging (e.g., context)
        ├── middlewares/      # FastAPI middlewares
        │   └── request_context.py # Middleware for setting up logging context (e.g., X-Request-ID)
        ├── models/           # ORM models / domain entities
        ├── repositories/     # Data access layer (repositories pattern)
        ├── service_layer/    # Business logic and transaction coordination
        └── utils/            # Utility helpers (orm, regex, schemas, timestamps)

πŸ› οΈ Development


🧪 Running Tests


πŸ› οΈ Development

🎨 Code Style & Linting

This project uses the following tools to ensure consistent code style and maintain code quality:

  • Ruff - formatting and linting, configured for a 120-character line length, 4-space indentation, and double quotes.
  • Mypy - static type checking with strict mode enabled, including plugins for Pydantic and SQLAlchemy.
  • Bandit - security analysis of Python code, configured to skip some known false positives.
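
As a reference point, the settings described above correspond to pyproject.toml fragments along these lines (exact values in the template may differ; the plugin names are the standard mypy plugins shipped by Pydantic and SQLAlchemy):

```toml
[tool.ruff]
line-length = 120
indent-width = 4

[tool.ruff.format]
quote-style = "double"

[tool.mypy]
strict = true
plugins = ["pydantic.mypy", "sqlalchemy.ext.mypy.plugin"]
```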

You can run these checks individually via Rye scripts or combined:

rye run lint               # Run ruff linter
rye run typecheck          # Run mypy type checks
rye run security-check     # Run Bandit security scan
rye run check-all          # Run all checks: ruff format & lint, mypy, bandit

🚢 Deployment

Deployment is done using Docker with the provided Dockerfile and docker-compose configuration found in the deploy folder.

To build and run the application locally:

docker-compose -f deploy/docker-compose.yml up --build

This will build the image and start the service with environment variables as configured.

For production deployments, customize environment variables and possibly use your own orchestration tooling based on the Docker image and compose files provided.


πŸ› οΈ Troubleshooting

Common issues and resolutions:

  • Database connection errors: Ensure your MIGRATIONS_URL_DATABASE environment variable is correctly set to the PostgreSQL URL before running migrations or starting the app.

  • Migration conflicts or errors: Use the migration CLI (rye run migrations) to check the current revision or history and verify migration state.

  • Linting or type check failures: Run rye run check-all to surface formatting, linting, and type errors; ruff fixes many formatting issues automatically.

  • Dependency issues: Use Rye for dependency management; run rye sync to install/update dependencies according to pyproject.toml.


📜 License

This project is licensed under the MIT License. See the LICENSE file for details.


πŸ™ Acknowledgements

This project was inspired by and builds upon multiple open-source Python and FastAPI templates, leveraging:

  • FastAPI framework for building APIs
  • SQLAlchemy for ORM and database management
  • Alembic for database migrations
  • Rye for modern Python dependency and script management
  • Community tools for linting, type checking, and security auditing

Thanks to all maintainers and contributors of these upstream projects.

