Opsmate is an LLM-powered SRE copilot for understanding and solving production problems. By encoding expert troubleshooting patterns and operational knowledge, Opsmate lets users describe problem statements and intentions in natural language, eliminating the need to memorise complex command line or domain-specific tool syntax.
Opsmate not only solves problems autonomously, but also lets human operators provide feedback and take over control when needed. It accelerates incident response, reduces mean time to repair (MTTR), and empowers teams to focus on solving problems rather than wrestling with tooling.
- Features
- Installation
- Configuration
- Quick Start
- Advanced Usage
- Use Cases
- Integrations
- Documentation
- Contributing
- License
- 🤖 Natural Language Interface: Run commands using natural language without remembering complex syntax
- 🔍 Advanced Reasoning: Troubleshoot and solve production issues with AI-powered reasoning
- 🔄 Multiple LLM Support: Works out of the box with OpenAI, Anthropic, and xAI, and is easy to extend to other LLMs.
- 🛠️ Multiple Runtimes: Supports various execution environments such as Local, Docker, Kubernetes and remote VMs.
- 🔭 Modern Observability Tooling: Built-in Prometheus support lets you create time-series dashboards with natural language, with more integrations to come.
- 🧠 Knowledge Management: Ingest and use domain-specific knowledge
- 📈 Web UI & API: Access Opsmate through a web interface or API
- 🔌 Plugin System: Extend Opsmate with custom plugins
Choose your preferred installation method:
The recommended way of installing opsmate is with uv:
```shell
# Using uv
uv tool install -U opsmate
```
Alternatively, you can install opsmate with pip, pipx, or Docker:
```shell
# Using pip
pip install -U opsmate
```
```shell
# Using pipx
pipx install opsmate
# or, to upgrade an existing installation
pipx upgrade opsmate
```
```shell
# Using Docker
docker pull ghcr.io/opsmate-ai/opsmate:latest
alias opsmate="docker run -it --rm --env OPENAI_API_KEY=$OPENAI_API_KEY -v $HOME/.opsmate:/root/.opsmate ghcr.io/opsmate-ai/opsmate:latest"
```
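If you use the Docker image regularly, a shell function can be more robust than the alias, because it quotes the forwarded arguments and works inside scripts. This is a sketch using the same image, environment variable, and state mount as the alias above:

```shell
# Wrap the Docker image in a function so arguments are forwarded quoted
# (same image, env var, and ~/.opsmate state mount as the alias above)
opsmate() {
  docker run -it --rm \
    --env OPENAI_API_KEY="$OPENAI_API_KEY" \
    -v "$HOME/.opsmate:/root/.opsmate" \
    ghcr.io/opsmate-ai/opsmate:latest "$@"
}
```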
```shell
# From source
git clone git@github.com:opsmate-ai/opsmate.git
cd opsmate
uv build
pipx install ./dist/opsmate-*.whl
```
Opsmate is powered by large language models. Out of the box it supports OpenAI, Anthropic, and xAI.
Set up your API key in an environment variable:
```shell
export OPENAI_API_KEY="sk-proj..."
# or
export ANTHROPIC_API_KEY="sk-ant-api03-..."
# or
export XAI_API_KEY="xai-..."
```
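Since Opsmate needs at least one provider key, a small pre-flight check can fail fast before you launch it. This is a minimal sketch covering the three variables above:

```shell
# Pre-flight check: succeed if any supported provider key is exported
check_llm_keys() {
  [ -n "${OPENAI_API_KEY:-}" ] || [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${XAI_API_KEY:-}" ]
}

if check_llm_keys; then
  echo "LLM credentials detected"
else
  echo "No LLM API key set; export one of the variables above" >&2
fi
```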
```shell
$ opsmate run "what's the gpu of the vm"
# Output: Command and result showing GPU information
```
```shell
$ opsmate solve "what's the k8s distro of the current context"
# Output: Thought process and analysis determining K8s distribution
```
```shell
$ opsmate chat
```
```shell
$ opsmate serve
# Web interface: http://localhost:8080
# API documentation: http://localhost:8080/api/docs
```
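Once the server is running, a quick smoke test against the documented port confirms it is reachable. This is a sketch, not an official health check; adjust the base URL if you serve on a different port:

```shell
# Return 0 when the API docs endpoint answers on the given base URL
opsmate_up() {
  curl -fsS --max-time 5 "${1:-http://localhost:8080}/api/docs" >/dev/null 2>&1
}

if opsmate_up; then
  echo "Opsmate API is reachable"
else
  echo "Opsmate API is not reachable" >&2
fi
```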
Opsmate can be deployed in production environments using the opsmate-operator in a Kubernetes cluster, providing:
- Task scheduling via CRDs
- Dedicated HTTPS endpoints and web UI for tasks
- Multi-tenancy support
- Automatic resource management with TTL
- API server for environment management
Check our production documentation for details.
Opsmate supports various use cases:
- Production issue troubleshooting and resolution
- Root cause analysis
- Performance analysis and improvement
- Observability and monitoring setup
- Capacity planning
- On-call engineer assistance
- Infrastructure as Code management
- Routine task automation (CI/CD, backups, updates)
- Knowledge management
- Workflow orchestration
For a comprehensive list of integrations, please refer to the integrations and cookbooks sections.
For comprehensive documentation, visit here.
Contributions are welcome! See our development guide for details.
This project is licensed under the MIT License - see the LICENSE file for details.