AegisAI is a Backend-as-a-Service (BaaS) platform for developing, deploying, and managing LLM-based AI agents. It unifies access to hundreds of LLMs, tools, and RAG (Retrieval-Augmented Generation) systems, and provides an intuitive console for managing agents, conversation histories, and functional modules.
Backend:
- Python 3.8+: Core language for server-side AI logic.
- FastAPI: High-performance asynchronous API framework for RESTful services (see the sketch after this list).
- PostgreSQL: Persistent storage for users, agents, and chat histories.
- Redis: Optional caching and session management for high concurrency.
- Docker & Docker Compose: Containerized deployments for consistency and scalability.
- Multi-Tenant Architecture: Isolated environments for multiple users/projects.
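To make the asynchronous backend pattern concrete, here is a minimal FastAPI sketch of an agent-management endpoint. It is illustrative only: the `/v1/agents` route, the `Agent` model, and the in-memory store are assumptions for this example, not AegisAI's actual schema (the real backend persists to PostgreSQL).

```python
# Minimal sketch of the async FastAPI pattern used by the backend.
# The route path, Agent model, and in-memory store are illustrative only.
from typing import Dict
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="aegisai-backend-sketch")

class AgentCreate(BaseModel):
    name: str
    model_id: str

class Agent(AgentCreate):
    agent_id: str

# Stand-in for the PostgreSQL layer in a real deployment.
_agents: Dict[str, Agent] = {}

@app.post("/v1/agents", response_model=Agent)
async def create_agent(payload: AgentCreate) -> Agent:
    # Async handlers let a single worker serve many concurrent requests.
    agent = Agent(agent_id=uuid4().hex, name=payload.name, model_id=payload.model_id)
    _agents[agent.agent_id] = agent
    return agent

@app.get("/v1/agents/{agent_id}", response_model=Agent)
async def get_agent(agent_id: str) -> Agent:
    if agent_id not in _agents:
        raise HTTPException(status_code=404, detail="Agent not found")
    return _agents[agent_id]
```

Running a sketch like this with uvicorn exposes FastAPI's auto-generated interactive docs at /docs.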
Frontend:
- HTML5 & CSS3: Semantic layout and responsive design.
- JavaScript: Dynamic UI components and asynchronous operations.
- Responsive Design: Optimized for desktop and mobile devices.
Integrations:
- LLM Providers: OpenAI, Anthropic, LocalAI, Ollama, LM Studio.
- Plugins / Tools: Google Search, Web Reader, Stock Market Retrieval, and custom tool creation.
- RAG Systems: Retrieval pipelines that ground agent responses in external knowledge for context-aware answers.
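For readers unfamiliar with RAG, the toy sketch below shows the basic retrieve-then-augment step in plain Python. The hash-based `embed()` and the small document list are placeholders for this illustration, not AegisAI's retrieval implementation.

```python
# Toy illustration of a RAG step: retrieve relevant text, then augment the prompt.
# The hash-based embed() stands in for a real embedding model.
import hashlib
import math
from typing import List, Tuple

def embed(text: str, dim: int = 64) -> List[float]:
    # Placeholder embedding: hash character trigrams into a fixed-size vector.
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        bucket = int(hashlib.md5(text[i:i + 3].encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: List[float], b: List[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

def retrieve(query: str, docs: List[str], k: int = 2) -> List[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    scored: List[Tuple[float, str]] = sorted(
        ((cosine(q, embed(d)), d) for d in docs), reverse=True
    )
    return [d for _, d in scored[:k]]

def build_prompt(query: str, docs: List[str]) -> str:
    # Augment the user question with the retrieved context.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using the context below.\nContext:\n{context}\n\nQuestion: {query}"

docs = [
    "AegisAI exposes agents over a REST API.",
    "Redis can be enabled for session caching.",
    "The console runs on port 8080 by default.",
]
print(build_prompt("Which port does the console use?", docs))
```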
Key Features:
- Unified LLM Platform: Access hundreds of models through a single API.
- Customizable RAG Pipelines: Enhance agent intelligence with retrieval-augmented generation.
- BaaS-Inspired Workflow: Decoupled server logic and client applications.
- One-Click Deployment: Launch AI agents into production seamlessly.
- Asynchronous Efficiency: FastAPI-powered concurrency for high performance.
- Intuitive UI Console: Manage agents, tools, and workflows easily.
- Multi-Tenant Support: Enterprise-grade deployment with isolated environments.
- Extensible Architecture: Integrate new LLMs, tools, or RAG systems easily.
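The extensible-architecture claim can be pictured as a provider registry: new LLM backends or tools plug in behind a common interface. The registry below is a hypothetical pattern shown for illustration, not AegisAI's internal plugin API.

```python
# Hypothetical provider-registry pattern showing how new LLM backends could
# plug in behind one interface; not AegisAI's internal plugin API.
from abc import ABC, abstractmethod
from typing import Dict, Type

class LLMProvider(ABC):
    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...

_REGISTRY: Dict[str, Type[LLMProvider]] = {}

def register(name: str):
    # Class decorator that adds a provider implementation to the registry.
    def decorator(cls: Type[LLMProvider]) -> Type[LLMProvider]:
        _REGISTRY[name] = cls
        return cls
    return decorator

@register("echo")
class EchoProvider(LLMProvider):
    # Trivial stand-in; a real provider would call OpenAI, Anthropic, Ollama, etc.
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def get_provider(name: str) -> LLMProvider:
    return _REGISTRY[name]()

print(get_provider("echo").complete("Hello!"))
```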
Prerequisites: Docker, Docker Compose, Git, Python 3.8+

```bash
git clone https://github.com/karianne50m/tasking-ai-docker-deploy.git
cd tasking-ai-docker-deploy/docker
cp .env.example .env
# Edit .env for configuration
docker-compose -p aegisai --env-file .env up -d
```
- Access the console at http://localhost:8080
- Default credentials: admin / AegisAI321
Install the Python client SDK:

```bash
pip install aegisai
```
Usage Example:

```python
import aegisai

# Connect the SDK to your AegisAI server with an API key.
aegisai.init(api_key='YOUR_API_KEY', host='http://localhost:8080')

# Create an assistant, open a chat, send a user message, and generate a reply.
assistant = aegisai.assistant.create_assistant(model_id="YOUR_MODEL_ID", memory="naive")
chat = aegisai.assistant.create_chat(assistant_id=assistant.assistant_id)
aegisai.assistant.create_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id, text="Hello!")
response = aegisai.assistant.generate_message(assistant_id=assistant.assistant_id, chat_id=chat.chat_id)
print(response)
```