A demonstration inventory management dashboard showcasing modern full-stack architecture with React frontend, FastAPI backend, and PostgreSQL (Lakebase) database. This is a demonstration application only and is not intended for production use. Designed to showcase deployment patterns on Databricks Apps with local development support.
⚠️ Disclaimer: This is a demonstration application designed for educational and showcase purposes only. It is not intended for production use, real business operations, or as a comprehensive enterprise solution. All data, performance metrics, and features are simulated for demonstration purposes.
- Application Features
- Architecture
- Quick Start
- Production Deployment with Databricks
- Project Structure
- Development
- Future Considerations
- How to get help
- Contributing
- License
- Real-time KPI cards with inventory metrics
- Interactive charts for sales and inventory trends
- Regional filtering and performance analytics
- Basic CRUD operations for products and stock levels
- Low stock alerts and sample reordering suggestions
- Category-based product organization
- Sample order lifecycle with approval workflows
- Basic order processing and fulfillment tracking
- Demo manager approval system with role-based permissions
- Multi-location inventory tracking
- Regional performance comparisons
- Store type categorization (retail vs warehouse)
- Sample Rust-based traffic simulator for database load testing demonstrations (an illustrative invocation is sketched after this list)
- Example query patterns simulating business operations (SELECT, INSERT, UPDATE)
- Configurable concurrent connections and traffic intensity levels
- Sample traffic patterns: Business Hours, E-Commerce Rush, Nightly Batch processing
- Basic performance metrics: throughput, latency analysis, connection efficiency
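For a feel of how a run is kicked off, here is an illustration; every flag below is a hypothetical placeholder rather than the simulator's real CLI, which is documented in `traffic-simulator/README.md`.

```bash
# Illustrative only: --pattern, --connections, and --duration are hypothetical
# placeholders; see traffic-simulator/README.md for the simulator's real flags.
cd traffic-simulator
cargo run --release -- --pattern business-hours --connections 16 --duration 300
```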
This project demonstrates a sample application architecture with centralized configuration management:
- Frontend: React + TypeScript with shadcn/ui components and Vite build system
- Backend: FastAPI with PostgreSQL integration and RESTful API design
- Database: PostgreSQL with automated setup scripts and demo data generation
- Traffic Simulator: Sample Rust application for database load testing demonstrations
- Deployment: Databricks Apps deployment showcase with secret management examples
- Centralized Configuration: Single `.env` file synced across all components
- Automated Setup: One-command environment setup and development server startup
- Demo Deployment: Databricks Apps deployment example with secret management
- Sample Dashboard: Interactive analytics demo with inventory management and order tracking
- Node.js 18+ for frontend development
- Python 3.10+ for backend and database setup
- Rust 1.70+ for traffic simulator application (optional)
- PostgreSQL database instance (16+ recommended)
- Databricks CLI for production deployment to a Databricks Apps environment
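A quick way to confirm the prerequisites are in place is to check each tool's version:

```bash
node --version        # expect v18 or later
python3 --version     # expect 3.10 or later
cargo --version       # optional; needed only for the traffic simulator (1.70+)
psql --version        # PostgreSQL client; a 16+ server instance is recommended
databricks --version  # Databricks CLI, used for deployment
```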
First, install and authenticate with the Databricks CLI:
```bash
# Authenticate with your Databricks workspace and set profile
databricks auth login --host <databricks-workspace-url> --profile <my-profile>

# Verify authentication
databricks auth describe --profile <my-profile>
```
For installation instructions, see the official Databricks CLI installation guide.
Run the setup script to configure all components:
```bash
./setup-env.sh
```
This script will:
- Create Python virtual environments for backend and database components
- Install all dependencies (npm packages and pip requirements)
- Copy the `.env` file from the project root across all components
- Verify prerequisites and system compatibility
Edit the root `.env` file with your actual configuration:
```bash
# Copy from template if not already created
cp env.example .env

# Edit with your settings
vim .env  # or your preferred editor
```
| Variable | Description | Example |
|---|---|---|
| `DB_HOST` | PostgreSQL host | `localhost` |
| `DB_PORT` | PostgreSQL port | `5432` |
| `DB_NAME` | Database name | `databricks_postgres` |
| `DB_USER` | Database username | `your_username` |
| `DB_PASSWORD` | Database password | `your_password` |
| `DATABRICKS_HOST` | Databricks workspace URL | `https://your-workspace.cloud.databricks.com` |
| `DATABRICKS_TOKEN` | Personal access token (PAT; optional) | `your_token` |
| `DATABRICKS_CLIENT_ID` | Databricks client ID (alternative to a PAT; optional) | `your_client_id` |
| `DATABRICKS_CLIENT_SECRET` | Databricks client secret (alternative to a PAT; optional) | `your_client_secret` |
NB: we recommend re-running `setup-env.sh` whenever the environment file is modified so the changes are synced to all components.
If you are using Lakebase Postgres, please ensure that `Postgres Native Role Login = Enabled`; this may require a restart of your Lakebase instance once updated in order to support static username and password credentials.
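With native role login enabled, a direct `psql` connection is a quick way to confirm the static credentials work; the host and user below are placeholders, and SSL is generally required for Lakebase connections:

```bash
# Connectivity check with static username/password credentials;
# psql prompts for the password.
psql "host=<your-lakebase-host> dbname=databricks_postgres user=<your_username> sslmode=require" \
  -c "SELECT current_user;"
```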
We recommend creating a dedicated service account for database interactions:
```sql
-- Create dedicated service account
CREATE USER api_service_account WITH
    ENCRYPTED PASSWORD 'SomeSecurePassword123'
    LOGIN
    NOCREATEDB
    NOCREATEROLE;

-- Grant database connection
GRANT CONNECT ON DATABASE databricks_postgres TO api_service_account;

-- Create analytics schema (admin only - requires CREATE privileges)
CREATE SCHEMA IF NOT EXISTS analytics;

-- Grant schema usage and create permissions
GRANT USAGE, CREATE ON SCHEMA public TO api_service_account;
GRANT USAGE, CREATE ON SCHEMA analytics TO api_service_account;

-- Grant table permissions
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO api_service_account;

-- Grant sequence permissions (for auto-increment columns)
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA public TO api_service_account;

-- Set default privileges for future objects
ALTER DEFAULT PRIVILEGES IN SCHEMA public
    GRANT SELECT, INSERT, UPDATE, DELETE ON TABLES TO api_service_account;
ALTER DEFAULT PRIVILEGES IN SCHEMA public
    GRANT USAGE, SELECT ON SEQUENCES TO api_service_account;

-- Grant analytics schema permissions for materialized views
ALTER DEFAULT PRIVILEGES IN SCHEMA analytics
    GRANT SELECT ON TABLES TO api_service_account;
```
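To confirm the grants took effect, Postgres's built-in privilege functions can be queried, for example:

```bash
# Inspect the service account's schema privileges with standard Postgres functions
psql -d databricks_postgres -c "
  SELECT has_schema_privilege('api_service_account', 'public',    'CREATE') AS public_create,
         has_schema_privilege('api_service_account', 'analytics', 'USAGE')  AS analytics_usage;"
```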
Update your `.env` file to use the service account credentials:
```
DB_USER=api_service_account
DB_PASSWORD=SomeSecurePassword123
```
Initialize the database with demo data:
```bash
cd database
source venv/bin/activate
python demo_setup.py
```
This creates all necessary tables and populates them with realistic demo data, including the following (a quick sanity check is sketched after this list):
- 50+ beverage products across multiple categories
- 20+ store locations across 4 US regions
- 240K+ orders with realistic approval workflows
- User accounts for store and regional managers
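As the sanity check mentioned above, you can count rows once the script finishes. The table names here (`products`, `stores`, `orders`) are assumptions inferred from the data described; `database/demo_setup.py` defines the actual schema.

```bash
# Table names below are assumptions; see database/demo_setup.py for the real schema.
psql -d databricks_postgres -c "
  SELECT (SELECT count(*) FROM products) AS products,
         (SELECT count(*) FROM stores)   AS stores,
         (SELECT count(*) FROM orders)   AS orders;"
```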
Launch both frontend and backend servers:
```bash
./start-dev.sh
```
The application will be available at:
- Frontend: http://localhost:5173
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
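Once both servers are up, a quick smoke test only needs `curl`; FastAPI serves its OpenAPI schema at `/openapi.json` by default:

```bash
# Backend: the OpenAPI schema is served by FastAPI out of the box
curl -s http://localhost:8000/openapi.json | head -c 200; echo

# Frontend: confirm the Vite dev server responds
curl -sI http://localhost:5173 | head -n 1
```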
For demonstration deployment, database credentials are managed through Databricks secrets.
Set up the required secrets using the Databricks CLI as follows:
```bash
# Create secret scope
databricks secrets create-scope brickhouse-scope --profile <my-profile>

# Set database credentials as secrets
databricks secrets put-secret brickhouse-scope db_host --string-value "your-db-host" --profile <my-profile>
databricks secrets put-secret brickhouse-scope db_user --string-value "your-db-username" --profile <my-profile>
databricks secrets put-secret brickhouse-scope db_password --string-value "your-db-password" --profile <my-profile>

# Verify secrets are created
databricks secrets list-secrets brickhouse-scope --profile <my-profile>
```
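Before deploying, it is worth validating the bundle definition against the target; `databricks bundle validate` is a standard CLI command:

```bash
# Validate the Databricks Asset Bundle configuration for the prod target
databricks bundle validate --target prod --profile <my-profile>
```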
Deploy the demo using the deployment script:

```bash
# Deploy the demo app under the given Databricks CLI profile and prod target via Databricks Asset Bundles
./deploy.sh --profile my-profile --target prod
```
The deployment script demonstrates the following (a rough manual equivalent is sketched after this list):
- Building the React frontend for deployment
- Copying frontend assets to backend static files
- Deploying the application bundle to Databricks
- Starting the application using Databricks Apps
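As mentioned, the deploy-and-run phase maps roughly onto the standard bundle commands below. This is a sketch: it assumes the frontend has already been built and copied into `backend/static/`, and `<app-resource-key>` is a placeholder for whatever app resource `databricks.yml` actually defines.

```bash
# Rough manual equivalent of what deploy.sh wraps; <app-resource-key> is a
# placeholder for the app resource defined in databricks.yml.
databricks bundle deploy --target prod --profile <my-profile>
databricks bundle run <app-resource-key> --target prod --profile <my-profile>
```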
```
brickhouse-brands-demo/
├── frontend/                # React + TypeScript frontend
│   ├── src/                 # Source code
│   ├── dist/                # Production build output
│   └── package.json         # Frontend dependencies
├── backend/                 # FastAPI backend
│   ├── app/                 # Application modules
│   ├── static/              # Frontend assets (after build)
│   ├── main.py              # FastAPI application
│   └── requirements.txt     # Python dependencies
├── database/                # Database setup and management
│   ├── demo_setup.py        # Database initialization script
│   └── requirements.txt     # Database tool dependencies
├── traffic-simulator/       # Rust-based database traffic simulator
│   ├── src/                 # Rust source code
│   ├── target/              # Compiled binaries
│   ├── Cargo.toml           # Rust dependencies
│   ├── run_simulation.sh    # Quick simulation script
│   └── README.md            # Detailed usage instructions
├── setup-env.sh             # Environment setup script
├── start-dev.sh             # Development server startup
├── deploy.sh                # Production deployment script
├── databricks.yml           # Databricks bundle configuration
└── env.example              # Environment variables template
```
Each component has its own focused README with specific development instructions:
- Frontend: See `frontend/README.md` for React development details
- Backend: See `backend/README.md` for FastAPI API documentation
- Database: See `database/README.md` for schema and setup details
- Traffic Simulator: See `traffic-simulator/README.md` for performance testing and simulation usage
```bash
# Set up everything
./setup-env.sh

# Start development servers
./start-dev.sh

# Manual component startup
cd backend && source venv/bin/activate && python startup.py
cd frontend && npm run dev

# Database setup and data generation (dry run)
cd database && source venv/bin/activate && python demo_setup.py --dry-run

# Traffic simulation (requires Rust and database setup)
cd traffic-simulator && ./run_simulation.sh
```
This demo application could be extended for learning purposes with:
- Custom Business Logic: Example API extensions with additional requirements
- Enhanced Analytics: Sample reporting and dashboard patterns
- Integration Examples: Demonstrations of external system connections
- Scaling Patterns: Examples of multi-workspace deployment patterns
- Authentication Examples: Sample user management with On-Behalf-Of Auth workflow and Unity Catalog patterns
Note: Any extensions should maintain the demonstration/educational focus and not be used for production workloads.
Databricks support does not cover this content. For questions or bugs, please open a GitHub issue and the team will help on a best-effort basis. Contributions are more than welcome!
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Run the setup script to configure your development environment
- Make your changes and test thoroughly
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Β© 2025 Databricks, Inc. All rights reserved. The source in this repository is provided subject to the Databricks License.
All included or referenced third party libraries are subject to the licenses set forth below.
Frontend libraries:

| library | description | license | source |
|---|---|---|---|
| react | JavaScript library for building user interfaces | MIT | https://github.com/facebook/react |
| typescript | Typed superset of JavaScript | Apache-2.0 | https://github.com/microsoft/TypeScript |
| vite | Next generation frontend build tool | MIT | https://github.com/vitejs/vite |
| tailwindcss | Utility-first CSS framework | MIT | https://github.com/tailwindlabs/tailwindcss |
| @radix-ui/react-* | Low-level UI primitives and components | MIT | https://github.com/radix-ui/primitives |
| @tanstack/react-query | Powerful data synchronization for React | MIT | https://github.com/TanStack/query |
| axios | Promise-based HTTP client | MIT | https://github.com/axios/axios |
| react-router-dom | Declarative routing for React | MIT | https://github.com/remix-run/react-router |
| react-hook-form | Performant forms with easy validation | MIT | https://github.com/react-hook-form/react-hook-form |
| zod | TypeScript-first schema validation | MIT | https://github.com/colinhacks/zod |
| zustand | Small, fast and scalable state management | MIT | https://github.com/pmndrs/zustand |
| lucide-react | Beautiful and customizable SVG icons | ISC | https://github.com/lucide-icons/lucide |
| recharts | Redefined chart library built with React and D3 | MIT | https://github.com/recharts/recharts |
| cmdk | Fast, unstyled command menu | MIT | https://github.com/pacocoursey/cmdk |
| class-variance-authority | CSS class variance API | Apache-2.0 | https://github.com/joe-bell/cva |
| clsx | Utility for constructing className strings | MIT | https://github.com/lukeed/clsx |
| date-fns | Modern JavaScript date utility library | MIT | https://github.com/date-fns/date-fns |
Backend libraries:

| library | description | license | source |
|---|---|---|---|
| fastapi | Modern, fast web framework for building APIs with Python | MIT | https://github.com/tiangolo/fastapi |
| uvicorn | Lightning-fast ASGI server | BSD-3-Clause | https://github.com/encode/uvicorn |
| pydantic | Data validation using Python type annotations | MIT | https://github.com/pydantic/pydantic |
| psycopg2-binary | PostgreSQL database adapter for Python | LGPL-3.0 | https://github.com/psycopg/psycopg2 |
| python-dotenv | Read key-value pairs from .env file | BSD-3-Clause | https://github.com/theskumar/python-dotenv |
| python-multipart | Streaming multipart parser for Python | Apache-2.0 | https://github.com/andrew-d/python-multipart |
| databricks-sdk | Databricks SDK for Python | Apache-2.0 | https://github.com/databricks/databricks-sdk-py |
| databricks-sql-connector | Databricks SQL Connector for Python | Apache-2.0 | https://github.com/databricks/databricks-sql-python |
| aiofiles | File support for asyncio | Apache-2.0 | https://github.com/Tinche/aiofiles |
Database tooling libraries:

| library | description | license | source |
|---|---|---|---|
| psycopg2-binary | PostgreSQL database adapter for Python | LGPL-3.0 | https://github.com/psycopg/psycopg2 |
| faker | Python package that generates fake data | MIT | https://github.com/joke2k/faker |
| tqdm | Fast, extensible progress bar for Python | MPL-2.0 & MIT | https://github.com/tqdm/tqdm |
| python-dateutil | Extensions to the standard Python datetime module | Apache-2.0 & BSD-3-Clause | https://github.com/dateutil/dateutil |
| numpy | Fundamental package for scientific computing | BSD-3-Clause | https://github.com/numpy/numpy |
| black | Code formatter for Python | MIT | https://github.com/psf/black |
Traffic simulator (Rust) libraries:

| library | description | license | source |
|---|---|---|---|
| anyhow | Flexible concrete Error type built on std::error::Error | MIT OR Apache-2.0 | https://github.com/dtolnay/anyhow |
| chrono | Date and time library for Rust | MIT OR Apache-2.0 | https://github.com/chronotope/chrono |
| clap | Command line argument parser | MIT OR Apache-2.0 | https://github.com/clap-rs/clap |
| deadpool-postgres | Dead simple async pool for PostgreSQL | MIT OR Apache-2.0 | https://github.com/bikeshedder/deadpool |
| futures | Asynchronous programming for Rust | MIT OR Apache-2.0 | https://github.com/rust-lang/futures-rs |
| native-tls | TLS/SSL streams for Rust | MIT OR Apache-2.0 | https://github.com/sfackler/rust-native-tls |
| postgres-native-tls | TLS support for postgres via native-tls | MIT OR Apache-2.0 | https://github.com/sfackler/rust-postgres |
| rand | Random number generators and other randomness functionality | MIT OR Apache-2.0 | https://github.com/rust-random/rand |
| rand_distr | Sampling from random number distributions | MIT OR Apache-2.0 | https://github.com/rust-random/rand |
| serde | Serialization framework for Rust | MIT OR Apache-2.0 | https://github.com/serde-rs/serde |
| tokio | Asynchronous runtime for Rust | MIT OR Apache-2.0 | https://github.com/tokio-rs/tokio |
| tokio-postgres | Native PostgreSQL driver for Rust | MIT OR Apache-2.0 | https://github.com/sfackler/rust-postgres |
| tracing | Application-level tracing for Rust | MIT OR Apache-2.0 | https://github.com/tokio-rs/tracing |
| tracing-subscriber | Utilities for implementing and composing tracing subscribers | MIT OR Apache-2.0 | https://github.com/tokio-rs/tracing |
| uuid | Generate and parse UUIDs | MIT OR Apache-2.0 | https://github.com/uuid-rs/uuid |