
neallawson/tasking-ai-docker-deploy

https://www.aegisai.ai

AegisAI: Scalable LLM Agent Platform



AegisAI is a Backend-as-a-Service (BaaS) platform for developing, deploying, and managing LLM-based AI agents. It unifies hundreds of LLM models, tools, and RAG (Retrieval-Augmented Generation) systems, providing an intuitive console for managing agents, conversation histories, and functional modules.

AegisAI Console


Technical Overview

Backend

  • Python 3.8+: Core language for server-side AI logic.
  • FastAPI: High-performance asynchronous API framework for RESTful services.
  • PostgreSQL: Persistent storage for users, agents, and chat histories.
  • Redis: Optional caching and session management for high concurrency.
  • Docker & Docker Compose: Containerized deployments for consistency and scalability.
  • Multi-Tenant Architecture: Isolated environments for multiple users/projects.
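
The asynchronous side of this stack can be sketched with plain `asyncio` (standing in for FastAPI's async request handling; the handler names and timings below are illustrative, not AegisAI internals):

```python
import asyncio
import time

async def handle_request(agent_id, delay):
    # Stand-in for an I/O-bound step (an LLM call, a PostgreSQL query, ...).
    # Awaiting frees the event loop to serve other requests in the meantime.
    await asyncio.sleep(delay)
    return f"agent {agent_id}: done"

async def main():
    start = time.perf_counter()
    # Three concurrent "requests" finish in roughly 0.1 s total rather than
    # 0.3 s -- the concurrency win that async handlers provide.
    results = await asyncio.gather(
        handle_request("a", 0.1),
        handle_request("b", 0.1),
        handle_request("c", 0.1),
    )
    print(f"{len(results)} requests in {time.perf_counter() - start:.2f}s")
    return results
```

Running `asyncio.run(main())` returns all three results while the handlers overlap, which is the behavior FastAPI relies on under load.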

Frontend / UI Console

  • HTML5 & CSS3: Semantic layout and responsive design.
  • JavaScript: Dynamic UI components and asynchronous operations.
  • Responsive Design: Optimized for desktop and mobile devices.

Integrations

  • LLM Providers: OpenAI, Anthropic, LocalAI, Ollama, LM Studio.
  • Plugins / Tools: Google Search, Web Reader, Stock Market Retrieval, and custom tool creation.
  • RAG Systems: Retrieval pipelines that ground agent responses in external context.
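
A RAG pipeline retrieves the documents most similar to a query and injects them into the prompt. Here is a minimal sketch of that retrieval step, using toy bag-of-words vectors in place of a real embedding model (all names and documents are illustrative, not part of the AegisAI API):

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a production pipeline would use a vector model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Redis provides optional caching and session management.",
    "PostgreSQL stores users, agents, and chat histories.",
]
context = retrieve("where are chat histories stored", docs)[0]
prompt = f"Context: {context}\n\nQuestion: where are chat histories stored?"
```

The retrieved `context` is prepended to the prompt so the model answers from it rather than from memory alone.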


Key Features

  1. Unified LLM Platform: Access hundreds of models through a single API.
  2. Customizable RAG Pipelines: Enhance agent intelligence with retrieval-augmented generation.
  3. BaaS-Inspired Workflow: Decoupled server logic and client applications.
  4. One-Click Deployment: Launch AI agents in production seamlessly.
  5. Asynchronous Efficiency: FastAPI-powered concurrency for high performance.
  6. Intuitive UI Console: Manage agents, tools, and workflows easily.
  7. Multi-Tenant Support: Enterprise-grade deployment with isolated environments.
  8. Extensible Architecture: Integrate new LLMs, tools, or RAG systems easily.
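
The "unified platform" and "extensible architecture" points amount to a provider-adapter pattern: every backend sits behind a registry and exposes the same call signature. A hypothetical sketch (the adapters here only echo their input; real ones would call the provider APIs):

```python
from typing import Callable, Dict

# Hypothetical adapter registry, one callable per provider, all sharing a
# single chat(prompt) -> str signature. Names mirror the integrations list.
PROVIDERS: Dict[str, Callable[[str], str]] = {}

def register(name):
    def wrap(fn):
        PROVIDERS[name] = fn
        return fn
    return wrap

@register("openai")
def openai_chat(prompt):
    return f"[openai] {prompt}"   # a real adapter would call the OpenAI API

@register("ollama")
def ollama_chat(prompt):
    return f"[ollama] {prompt}"   # a real adapter would call a local Ollama server

def chat(provider, prompt):
    # Single entry point regardless of backend, as the feature list describes.
    return PROVIDERS[provider](prompt)
```

Adding a new LLM backend then means registering one more adapter, with no changes to calling code.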

Quickstart with Docker

Prerequisites: Docker, Docker Compose, Git, Python 3.8+

git clone https://github.com/neallawson/tasking-ai-docker-deploy.git
cd tasking-ai-docker-deploy/docker
cp .env.example .env
# Edit .env for configuration
docker-compose -p aegisai --env-file .env up -d
  • Access console: http://localhost:8080
  • Default credentials: admin / AegisAI321
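
The keys in `.env.example` vary by release; a typical set for this compose stack might look like the following (illustrative names only — keep whatever your copy of `.env.example` actually defines):

```
# Illustrative only; consult .env.example for the real key names.
POSTGRES_DB=aegisai
POSTGRES_USER=aegisai
POSTGRES_PASSWORD=change-me
REDIS_URL=redis://redis:6379/0
CONSOLE_PORT=8080
```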

Python Client SDK

pip install aegisai

Usage Example:

import aegisai

# Point the SDK at your deployment (replace the placeholders).
aegisai.init(api_key="YOUR_API_KEY", host="http://localhost:8080")

# Create an assistant backed by a configured model, then open a chat session.
assistant = aegisai.assistant.create_assistant(model_id="YOUR_MODEL_ID", memory="naive")
chat = aegisai.assistant.create_chat(assistant_id=assistant.assistant_id)

# Send a user message and ask the assistant to generate a reply.
aegisai.assistant.create_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id, text="Hello!")
response = aegisai.assistant.generate_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id)
print(response)
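
SDK calls go over HTTP and can fail transiently, so it is common to wrap calls like `generate_message` in a small retry helper. A sketch in plain Python, independent of the SDK itself (the helper name and defaults are illustrative):

```python
import time

def with_retries(fn, attempts=3, backoff=0.5):
    # Call fn(); on any exception, retry with exponentially growing waits.
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(backoff * (2 ** i))

# Hypothetical usage with the SDK example above:
# response = with_retries(lambda: aegisai.assistant.generate_message(
#     assistant_id=assistant.assistant_id, chat_id=chat.chat_id))
```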



