## How It Works 🔧
Digma is an application that pre-processes your observability data to identify issues and track code performance and runtime data.
To get started:

1. **Deploy Digma in Your Cluster**: Digma is a K8s-native application; follow this [guide](https://docs.digma.ai/digma-developer-guide/installation/central-on-prem-install) to install it.
2. **Send Digma Traces**: Digma accepts any standard OTEL traces; it's easy to extend your data pipeline to [send the observability data](https://docs.digma.ai/digma-developer-guide/instrumentation/instrumenting-your-code-for-tracing) to your local Digma deployment. Alternatively, you can [dual-ship](https://docs.digma.ai/digma-developer-guide/instrumentation/sending-data-to-digma-using-the-datadog-agent) your Datadog traces if you're using Datadog.
3. **Install the Digma MCP Server**: Follow the instructions below to install the Digma MCP Server with your GenAI agent.

Check out our [MCP page](https://digma.ai/mcp/) to get started.
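Sending traces to Digma (step 2 above) can be sketched as an OpenTelemetry Collector pipeline with an extra OTLP exporter pointed at the in-cluster Digma deployment. This is a minimal, hypothetical sketch: the service name and port below are assumptions, not documented values; consult the linked instrumentation guide for your deployment's actual endpoint.

```yaml
# OpenTelemetry Collector config fragment (sketch).
# The Digma endpoint below is a placeholder -- replace it with the
# address of your in-cluster Digma collector.
exporters:
  otlp/digma:
    endpoint: "digma-collector.digma.svc.cluster.local:4317"
    tls:
      insecure: true  # assumes plaintext OTLP inside the cluster

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp/digma]
```

Dual-shipping follows the same idea: keep your existing exporter (e.g. Datadog) in the `exporters` list of the traces pipeline and add the Digma exporter alongside it.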