LiteMaaS - Models as a Service Platform

LiteMaaS is a proof-of-concept application for managing LLM subscriptions, API keys, and usage tracking. It seamlessly integrates with LiteLLM to provide a unified interface for accessing multiple LLMs with comprehensive budget management and team collaboration features.

🎟️ Key Benefits

  • Self-Service: Users can independently manage subscriptions and API keys
  • Multi-Model Access: One API key can access multiple models (see the request sketch after this list)
  • Budget Control: Set spending limits at user, team, and API key levels
  • Usage Analytics: Real-time tracking of token usage and costs
  • Team Collaboration: Share budgets and manage access across teams (upcoming feature)
  • Enterprise Security: OAuth2 authentication with OpenShift support
  • Easy Deployment and Usage: Automated deployment on OpenShift and a PatternFly 6 UI
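
As a sketch of the multi-model access mentioned above: LiteLLM exposes an OpenAI-compatible API, so a single key can target different models by changing only the model field. The host, key, and model names below are placeholders, not values from this repository.

# Hypothetical gateway URL and key - substitute your LiteLLM route and a key issued by LiteMaaS
LITELLM_URL="https://litellm-<namespace>.<cluster-domain>"
API_KEY="sk-your-litemaas-key"

# Same key, two different models, via the OpenAI-compatible chat completions endpoint
for MODEL in "gpt-4o-mini" "claude-3-haiku"; do
  curl -s "$LITELLM_URL/v1/chat/completions" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}"
done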

πŸ“Έ Screenshots

Model Discovery

model-discovery.png

Subscriptions Management

subscriptions-management.png

API Keys Management

api-keys-management.png

API Key Details

api-key-details.png

Usage Statistics

usage-statistics.png

πŸš€ Quick Start

# Install dependencies
npm install

# Start development servers (both backend and frontend)
npm run dev

# Backend only (http://localhost:8081)
npm run dev:backend

# Frontend only (http://localhost:3000)
npm run dev:frontend

πŸ“‹ Prerequisites

  • Node.js 18.x or 20.x
  • PostgreSQL 12+
  • npm 8+
  • LiteLLM instance (optional; a mock fallback is available)
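
To confirm the prerequisites are in place, the standard version checks below can be run before installing (generic commands, nothing specific to this repo):

# Verify tool versions
node --version    # expect v18.x or v20.x
npm --version     # expect 8 or newer
psql --version    # expect a PostgreSQL 12+ client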

πŸ—οΈ Architecture

LiteMaaS is a monorepo using npm workspaces with two main packages:

  • Backend (@litemaas/backend): High-performance Fastify API server with PostgreSQL
  • Frontend (@litemaas/frontend): Modern React application with PatternFly 6 UI
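
Because the repository uses npm workspaces, scripts and dependency changes can also be scoped to a single package. The commands below are a sketch that assumes each workspace defines the usual scripts; the dependency name is only an example.

# Run a script in one workspace only
npm run build --workspace=@litemaas/backend

# Add a dependency to a single workspace
npm install axios --workspace=@litemaas/frontend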

🎯 System Architecture

graph TB
    subgraph "Frontend"
        UI[React + PatternFly 6]
    end

    subgraph "Backend Services"
        API[Fastify API Server]
        Auth[OAuth2/JWT Auth]
        DB[(PostgreSQL)]
    end

    subgraph "External Services"
        LiteLLM[LiteLLM Gateway]
        OAuth[OpenShift OAuth]
    end

    UI <--> API
    API <--> Auth
    API <--> DB
    API <--> LiteLLM
    Auth <--> OAuth

    style UI fill:#e3f2fd
    style API fill:#fff3e0
    style LiteLLM fill:#f3e5f5

πŸ› οΈ Development Setup

  1. Clone the repository

    git clone https://github.com/rh-aiservices-bu/litemaas.git
    cd litemaas
  2. Set up environment variables

    # Copy example environment files
    cp backend/.env.example backend/.env
    cp frontend/.env.example frontend/.env
  3. Configure your environment

    • See Configuration Guide for all available options
    • Minimum required: DATABASE_URL, JWT_SECRET, OAuth credentials (a minimal .env sketch follows these steps)
  4. Start PostgreSQL (using Docker)

    docker compose -f dev-tools/compose.yaml up -d postgres
  5. Run the application

    npm run dev
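
For step 3, a minimal backend/.env sketch is shown below. DATABASE_URL and JWT_SECRET are the variables named above; the OAuth variable names are illustrative placeholders, so check backend/.env.example and the Configuration Guide for the exact names and values.

# backend/.env - illustrative values only
DATABASE_URL=postgresql://litemaas:litemaas@localhost:5432/litemaas
JWT_SECRET=change-me-to-a-long-random-string

# OAuth credentials (placeholder variable names - see .env.example)
OAUTH_CLIENT_ID=your-client-id
OAUTH_CLIENT_SECRET=your-client-secret
OAUTH_ISSUER=https://oauth-openshift.apps.<cluster-domain>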

πŸ“š Documentation

Quick Links

πŸ§ͺ Testing

# Run all tests
npm run test

# Backend tests
npm run test:backend

# Frontend tests
npm run test:frontend

# E2E tests
npm run test:e2e

# Performance tests
npm run test:perf

πŸš€ Deployment

LiteMaaS supports multiple deployment methods:

OpenShift/Kubernetes Deployment (Recommended for Production)

Deploy to OpenShift or Kubernetes using the provided manifests and Kustomize configuration:

# Deploy to OpenShift
oc apply -k deployment/openshift/

# Access at:
# - LiteMaaS: https://litemaas-<namespace>.<cluster-domain>
# - LiteLLM UI: https://litellm-<namespace>.<cluster-domain>
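
To verify the rollout, the generic OpenShift commands below list pod status and the generated routes; they are standard oc commands rather than anything specific to these manifests.

# Check that pods are running and routes were created
oc get pods -n <namespace>
oc get routes -n <namespace>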

πŸ“š Complete Guide: OpenShift Deployment Guide

Container Deployment (Development/Testing)

Deploy using containers with Docker Compose for local development:

# Local deployment with containers
docker-compose up -d

# Or with Podman
podman-compose up -d

For detailed container deployment instructions, see Container Deployment Guide.

Manual Deployment

For traditional deployment instructions, see the Production Guide.

Quick Production Build

# Build both packages
npm run build

# Build output locations:
# - Backend: backend/dist/
# - Frontend: frontend/dist/

Container Image Build

Build container images for deployment using the automated build script:

πŸ“¦ Custom Registry: To use a different container registry, edit the REGISTRY variable in scripts/build-containers.sh before building:

# Change this line in scripts/build-containers.sh:
REGISTRY="your-registry.com/your-org"

# Build both backend and frontend container images
npm run build:containers

# Build and push to registry (default: quay.io/rh-aiservices-bu)
npm run build:containers:push

# Push existing images to registry
npm run push:containers

Features:

  • 🏷️ Centralized versioning - Uses version from root package.json
  • 🐳 Multi-runtime support - Works with Docker or Podman
  • πŸš€ Flexible workflow - Separate build and push operations
  • πŸ—οΈ Platform support - Build for different architectures
  • πŸ”§ Registry flexibility - Easy to configure for any container registry

Script options:

# Build without cache
./scripts/build-containers.sh --no-cache

# Build for ARM64
./scripts/build-containers.sh --platform linux/arm64

# Build with local tags only (no registry prefix)
./scripts/build-containers.sh --local

🀝 Contributing

We welcome contributions! Please see our Contributing Guide for details on:

  • Code style and conventions
  • Development workflow
  • Submitting pull requests
  • Reporting issues

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ”— Related Projects

  • LiteLLM - The AI model proxy that LiteMaaS integrates with
  • PatternFly - The UI framework used in the frontend

πŸ’¬ Getting Help


Built with ❀️ by the CAI Team
