A comprehensive, production-ready NestJS API starter kit with authentication, role-based access control, database integration, and other enterprise-grade features.
## Authentication & Authorization
- JWT Authentication with refresh tokens
- Role-based access control (RBAC)
- Permission-based access control
- CSRF protection
- Rate limiting and throttling
## Core Infrastructure
- Modular architecture following NestJS best practices
- PostgreSQL integration with TypeORM
- In-memory caching support
- Comprehensive logging with Winston
- Health check endpoints with detailed system monitoring
- Request/response validation with class-validator
- API versioning with proper routing structure
## Developer Experience
- Swagger/OpenAPI documentation
- Environment configuration with validation
- Automated testing infrastructure (unit, integration, e2e)
- Docker & Docker Compose for local development
- GitHub Actions CI/CD workflows
- Linting and code formatting (ESLint, Prettier)
- Git hooks with Husky and lint-staged
- Conventional commits enforcement
## Database Tools
- Database migrations
- Data seeding (development, testing, production)
- Query pagination support
## Production Ready
- Optimized Docker images with multi-stage builds
- API error handling and standardized responses
- CORS configuration
- Helmet security headers
- Health monitoring and metrics
## Prerequisites

- Node.js (v20+)
- PNPM (v8+)
- PostgreSQL (v16+)
- Docker & Docker Compose (optional)
## Getting Started

1. Clone this repository
2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Create a `.env` file from the example:

   ```bash
   cp .env.example .env
   ```

4. Update the `.env` file with your configuration
5. Start the application:

   ```bash
   pnpm start:dev
   ```
Run the application with Docker:

```bash
# Development mode with hot reload
docker compose build
docker compose up -d

# Production mode
NODE_ENV=production docker compose up -d
```
The application includes comprehensive health checks at `/v1/health` that monitor:
- API status
- Database connectivity
- Disk storage usage
- Memory usage
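For illustration, a healthy response typically follows the `@nestjs/terminus` payload shape sketched below; the actual indicator keys (`database`, `memory_heap`, `storage` here) depend on how this starter configures its health indicators:

```json
{
  "status": "ok",
  "info": {
    "database": { "status": "up" },
    "memory_heap": { "status": "up" },
    "storage": { "status": "up" }
  },
  "error": {},
  "details": {
    "database": { "status": "up" },
    "memory_heap": { "status": "up" },
    "storage": { "status": "up" }
  }
}
```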
The application uses NestJS's built-in in-memory caching system for performance optimization. Configure the cache TTL in your `.env` file:

```bash
CACHE_TTL=300 # Time-to-live in seconds (default: 5 minutes)
```
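To make the TTL semantics concrete, here is a minimal sketch of a time-to-live cache in plain TypeScript. This is illustrative only; the starter itself relies on NestJS's cache-manager integration rather than this hypothetical `TtlCache` class:

```typescript
// Minimal TTL cache sketch: entries expire `ttlMs` milliseconds after insertion.
type Entry<V> = { value: V; expiresAt: number };

class TtlCache<V> {
  private store = new Map<string, Entry<V>>();

  constructor(private ttlMs: number) {}

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazily evict expired entries on read
      return undefined;
    }
    return entry.value;
  }
}

// CACHE_TTL=300 (seconds) corresponds to 300 * 1000 ms
const cache = new TtlCache<string>(300 * 1000);
cache.set("users:1", "alice");
```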
CSRF protection is enabled by default for all non-GET endpoints. The CSRF token is returned in the `csrf-token` response header of any GET request and must be included in subsequent non-GET requests either as:

- a `csrf-token` or `x-csrf-token` header, or
- a `_csrf` property in the request body
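A client might attach the token along these lines. The helper below is a hypothetical sketch (the function name and shape are not part of this starter); it only shows the header convention described above:

```typescript
// Hypothetical client-side helper: attach a previously received CSRF token
// to a non-GET request, using the x-csrf-token header convention.
function withCsrfToken(
  token: string,
  init: { method: string; headers?: Record<string, string> }
): { method: string; headers: Record<string, string> } {
  // GET requests do not need the token; everything else must carry it.
  if (init.method.toUpperCase() === "GET") {
    return { ...init, headers: { ...(init.headers ?? {}) } };
  }
  return {
    ...init,
    headers: { ...(init.headers ?? {}), "x-csrf-token": token },
  };
}

// Example: token previously read from the `csrf-token` header of a GET response
const req = withCsrfToken("abc123", { method: "POST" });
```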
API rate limiting is configured at 100 requests per minute by default. Customize it in `.env`:

```bash
THROTTLE_TTL=60000   # Window length in milliseconds
THROTTLE_LIMIT=100   # Maximum requests per window
```
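The `THROTTLE_TTL` / `THROTTLE_LIMIT` pair can be read as "at most LIMIT requests per TTL window". The fixed-window sketch below illustrates that semantics only; the starter itself uses `@nestjs/throttler`, whose algorithm may differ in detail:

```typescript
// Illustrative fixed-window limiter: at most `limit` requests per `ttlMs` window.
class FixedWindowLimiter {
  private windowStart = 0;
  private count = 0;

  constructor(private ttlMs: number, private limit: number) {}

  // Returns true if a request arriving at time `now` (ms) is allowed.
  allow(now: number): boolean {
    if (now - this.windowStart >= this.ttlMs) {
      this.windowStart = now; // start a fresh window
      this.count = 0;
    }
    if (this.count >= this.limit) return false;
    this.count += 1;
    return true;
  }
}

// Mirrors THROTTLE_TTL=60000 and THROTTLE_LIMIT=100
const limiter = new FixedWindowLimiter(60000, 100);
```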
API versioning is enabled through URI paths; endpoints are accessible at `/v1/resource`. When introducing breaking changes, create new controllers under a new version namespace.
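URI versioning in NestJS is enabled in the bootstrap code. The fragment below is a sketch of the standard `enableVersioning` call, not this starter's exact `main.ts`:

```typescript
// Sketch of URI-based versioning during bootstrap (fragment, not a full main.ts).
import { VersioningType } from '@nestjs/common';

app.enableVersioning({
  type: VersioningType.URI,
  defaultVersion: '1', // routes are served under /v1/...
});
```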
```bash
# Run unit tests
pnpm test

# Run e2e tests
pnpm test:e2e

# Generate test coverage
pnpm test:cov
```
Continuous Integration and Deployment is set up using GitHub Actions:

- CI Pipeline: runs on all pushes to `main` and `develop` branches and on all PRs
  - Linting and type checking
  - Unit and integration tests
  - Test coverage reporting
- CD Pipeline:
  - Triggered by pushes to `main` (deploys to staging)
  - Triggered by version tags (deploys to production)
  - Builds and pushes Docker images to GitHub Container Registry
  - Supports deployment to multiple environments
Swagger documentation is available at `/docs` when the application is running.
## Available Scripts

- `pnpm start:dev` - Start the application in development mode
- `pnpm build` - Build the application
- `pnpm start:prod` - Start the application in production mode
- `pnpm test` - Run tests
- `pnpm test:watch` - Run tests in watch mode
- `pnpm test:cov` - Run tests with coverage
- `pnpm test:e2e` - Run end-to-end tests
- `pnpm lint` - Run linting
- `pnpm format` - Run code formatting
- `pnpm typecheck` - Run type checking
- `pnpm migration:generate -- src/database/migrations/MigrationName` - Generate a new migration
- `pnpm migration:run` - Run migrations
- `pnpm migration:revert` - Revert the last migration
- `pnpm seed:init` - Seed the database with initial data
```
src/
├── app.controller.ts   # App controller
├── app.module.ts       # Main application module
├── app.service.ts      # App service
├── main.ts             # Application entry point
├── config/             # Configuration management
├── database/           # Database setup and migrations
├── filters/            # Global exception filters
├── guards/             # Authentication guards
├── interceptors/       # HTTP interceptors
├── lib/                # Shared libraries
│   ├── cache/          # Caching implementation
│   └── logger/         # Logging implementation
├── modules/            # Feature modules
│   ├── auth/           # Authentication module
│   ├── health/         # Health check module
│   ├── shared/         # Shared module
│   └── users/          # Users module
├── pipes/              # Validation pipes
├── security/           # Security features
│   └── csrf/           # CSRF protection
└── seeders/            # Database seeders
```
This project uses Husky to enforce code quality and consistency through Git hooks:
- pre-commit: Runs linting and formatting on staged files using lint-staged
- pre-push: Runs tests and type checking before pushing to remote
- commit-msg: Validates commit messages against conventional commit format
Husky ensures that all code meets the project's quality standards before being committed or pushed.
We enforce the Conventional Commits specification for commit messages. Each commit message must follow this format:

```
type(scope): message [#issue-number]
```
Types allowed (from commitlint.config.js):

- `feat`: A new feature
- `fix`: A bug fix
- `docs`: Documentation changes
- `style`: Code style changes (formatting, etc.)
- `refactor`: Code changes that neither fix bugs nor add features
- `perf`: Performance improvements
- `test`: Adding or updating tests
- `chore`: Changes to the build process, tools, etc.
- `revert`: Reverting a previous commit
Rules:
- Header length must not exceed 72 characters
- A reference to an issue is required
- Type must be one of the allowed types listed above
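These rules map onto standard commitlint rule names. The sketch below shows what a commitlint.config.js enforcing them might look like; the project's actual config may differ in detail:

```javascript
// Sketch of a commitlint.config.js matching the rules above (illustrative).
module.exports = {
  extends: ['@commitlint/config-conventional'],
  rules: {
    'header-max-length': [2, 'always', 72],
    'references-empty': [2, 'never'], // an issue reference (#123) is required
    'type-enum': [
      2,
      'always',
      ['feat', 'fix', 'docs', 'style', 'refactor', 'perf', 'test', 'chore', 'revert'],
    ],
  },
};
```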
Examples:

```
feat(auth): implement refresh token rotation #123
fix(api): resolve race condition in request handler #456
docs(readme): update deployment instructions #789
```
## Contributing

1. Fork the repository
2. Create a new feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Commit your changes using conventional commits
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request
## License

MIT