# Digma MCP Server

A Model Context Protocol (MCP) server implementation for enabling agents to access observability insights using [Digma](https://digma.ai).

## Key Features 🚀

* **🗣️ Observability-assisted code reviews:** Review code changes in the context of related runtime issues and performance data.
* **🔎 Find code inefficiencies with dynamic code analysis:** Identify issues in the code and queries that are slowing the app down.
* **🔭 Utilize code runtime usage data from distributed tracing:** Check for breaking changes or generate relevant tests.

## Example prompts 💬

* `help me review the code changes in this branch by looking at related runtime issues`
* `I want to improve the performance of this app. What are the three most severe issues I can fix?`
* `I'm making changes to this function, based on runtime data. What other services and code would be affected?`
* `Are there any new issues in this code based on the Staging environment?`
* `Which database queries have the most impact on the application performance?`

---

## How It Works 🔧

Digma is an application that pre-processes your observability data to identify issues and track code performance and runtime data.
To get started:
1. **Deploy Digma in Your Cluster:** Digma is a K8s-native application; follow this [guide](https://docs.digma.ai/digma-developer-guide/installation/central-on-prem-install) to install it.
2. **Send Digma Traces:** Digma accepts any standard OTEL traces; it's easy to extend your data pipeline to [send the observability data](https://docs.digma.ai/digma-developer-guide/instrumentation/instrumenting-your-code-for-tracing) to your local Digma deployment. Alternatively, you can [dual-ship](https://docs.digma.ai/digma-developer-guide/instrumentation/sending-data-to-digma-using-the-datadog-agent) your Datadog traces if you're using DD.
3. **Follow the instructions below to install the Digma MCP server with your GenAI agent.**

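For step 2, if your services already use standard OpenTelemetry instrumentation, pointing them at Digma is typically a matter of changing the OTLP exporter endpoint. A minimal sketch using the standard OTEL environment variables; the collector hostname and the environment attribute value here are placeholders, so check your Digma deployment and the linked instrumentation guide for the actual values:

```shell
# Standard OpenTelemetry exporter settings. The endpoint below is a
# placeholder: use the OTLP endpoint exposed by your Digma deployment.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<DIGMA_COLLECTOR_URL>:4317"

# Identify the service and (optionally) the deployment environment,
# e.g. for environment-scoped questions like the Staging prompt above.
export OTEL_SERVICE_NAME="my-service"
export OTEL_RESOURCE_ATTRIBUTES="deployment.environment=STAGING"
```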
---

## Installation ⚙️

Configure your MCP client (Claude, Cursor, etc.) to include the Digma MCP server.
The Digma MCP server is included as a remote SSE server in your Digma deployment. You can configure it in your client using its URL, or use an MCP tool such as [SuperGateway](https://github.com/supercorp-ai/supergateway) to run it as a command tool.
The MCP URL path is composed of the Digma API token as follows:
`https://<DIGMA_URL>/<DIGMA_API_TOKEN>/sse`

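The URL composition above can be sketched in shell. The hostname and token below are made-up example values; substitute your own deployment's URL and API token:

```shell
# Hypothetical example values: substitute your own deployment's
# hostname and your Digma API token.
DIGMA_URL="digma.example.com"
DIGMA_API_TOKEN="abc123"

# The MCP SSE endpoint is the deployment URL followed by the API token.
MCP_URL="https://${DIGMA_URL}/${DIGMA_API_TOKEN}/sse"
echo "${MCP_URL}"
```

You can then check that the endpoint is reachable with `curl -N "$MCP_URL"` (`-N` disables output buffering, which SSE streams require).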
### Example MCP configuration

If your client supports SSE servers, you can use the following syntax:

```json
{
  "mcpServers": {
    "digma": {
      "url": "https://<DIGMA_URL>/<DIGMA_API_TOKEN>/sse"
    }
    // ... other servers might be here ...
  }
}
```
| 53 | + |
To use the MCP server as a command tool, use the [SuperGateway](https://github.com/supercorp-ai/supergateway) tool to bridge to the URL, as seen below:

```json
{
  "digma": {
    "command": "npx",
    "args": [
      "-y",
      "supergateway",
      "--sse",
      "https://<DIGMA_URL>/<DIGMA_API_TOKEN>/sse"
    ]
  }
}
```
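The same bridge can be exercised from a terminal before wiring it into a client, which is a quick way to confirm the URL and token are correct. This is a sketch with the same placeholders as above, not a command to run verbatim:

```shell
# Runs SuperGateway as a stdio bridge to the Digma SSE endpoint.
# Replace the placeholders with your deployment URL and API token.
npx -y supergateway --sse "https://<DIGMA_URL>/<DIGMA_API_TOKEN>/sse"
```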


## License 📜

MIT License. See the [LICENSE](LICENSE) file.
