# feat: Mastra instrumentation #1598
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Merged · +2,801 −10
Commits (27, all by cephalization):

- 2a0b11b feat: Mastra instrumentation
- eb909ea changeset
- cfb5da9 Update readme
- 81b8a14 Extract project name from mastra componentName attr
- 7559559 Replace jest with vitest in mastra package
- 9ba107e Restructure exports
- 09d8d54 Update readme
- 46ed9b8 Set span kind for mastra spans to agent
- 7dd05fd bump lockfile
- cb8e3a9 Install optional deps in CI
- 52476e1 format
- c59a634 debug vercel attrs
- 8f8da8e feat(openinference-vercel): Instrument multi-part tool calls and tool…
- 60bc062 fix(openinference-mastra): Do not depend on opentelemetry/core
- d05214e Handle multi content tool messages
- 7e4c216 changeset
- c416de2 Add tests and docs
- c6bafea Update test
- 8668a07 Format
- e3d4515 Update js/packages/openinference-mastra/src/OpenInferenceTraceExporte…
- 637e046 Update js/packages/openinference-mastra/README.md
- 0d68d64 do not format test fixtures
- 349a01a clarify debug fn
- 1bfb9ac Do not rename otel exporter constructor args
- 53b72c9 Rename variables
- 218621c Simplify attribute enrichment logic, rename functions for clarity
- 098990c Update otel dependencies to support mastra v0.10
## Files changed

### New changeset (@arizeai/openinference-mastra, @arizeai/openinference-vercel)
---
"@arizeai/openinference-mastra": major
"@arizeai/openinference-vercel": minor
---

feat: Mastra instrumentation

Initial instrumentation for Mastra, adhering to OpenInference semantic conventions
### New changeset (@arizeai/openinference-vercel)
---
"@arizeai/openinference-vercel": minor
---

feat: Instrument tool calls and results from multi-part content messages
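To illustrate what this change covers, here is a sketch of multi-part content messages in the Vercel AI SDK style. The part fields (`toolCallId`, `toolName`, `args`, `result`) follow the AI SDK's message shape; the object literals themselves are illustrative and not taken from this PR.

```typescript
// Illustrative multi-part messages: an assistant turn that mixes text with a
// tool call, and the matching tool result turn. The instrumentation walks
// each entry in `content` rather than assuming a single string.
const assistantMessage = {
  role: "assistant",
  content: [
    { type: "text", text: "Let me check the weather." },
    {
      type: "tool-call",
      toolCallId: "call_1",
      toolName: "weatherTool",
      args: { location: "Paris" },
    },
  ],
};

const toolMessage = {
  role: "tool",
  content: [
    {
      type: "tool-result",
      toolCallId: "call_1",
      toolName: "weatherTool",
      result: { temperature: 18 },
    },
  ],
};

// Collect the tool-call parts the way an exporter might before mapping them
// onto OpenInference tool attributes.
const toolCalls = assistantMessage.content.filter(
  (part) => part.type === "tool-call",
);
console.log(toolCalls.length); // 1
```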
### CHANGELOG.md — @arizeai/openinference-vercel (new file)
# @arizeai/openinference-vercel

## 2.0.3

### Patch Changes

- Updated dependencies [ae5cd15]
  - @arizeai/openinference-semantic-conventions@1.1.0
  - @arizeai/openinference-core@1.0.2

## 2.0.2

### Patch Changes

- Updated dependencies [c4e2252]
  - @arizeai/openinference-semantic-conventions@1.0.1
  - @arizeai/openinference-core@1.0.1

## 2.0.1

### Patch Changes

- 365a3c2: Updated the OpenInference semantic convention mapping to account for changes to the Vercel AI SDK semantic conventions

## 2.0.0

### Major Changes

- 16a3815: ESM support

  Packages are now shipped as "Dual Package", meaning that both ESM and CJS module resolution
  should be supported for each package.

  Support is described as "experimental" because opentelemetry describes support for autoinstrumenting
  ESM projects as "ongoing". See https://github.com/open-telemetry/opentelemetry-js/blob/61d5a0e291db26c2af638274947081b29db3f0ca/doc/esm-support.md

### Patch Changes

- Updated dependencies [16a3815]
  - @arizeai/openinference-semantic-conventions@1.0.0
  - @arizeai/openinference-core@1.0.0

## 1.2.2

### Patch Changes

- Updated dependencies [1188c6d]
  - @arizeai/openinference-semantic-conventions@0.14.0
  - @arizeai/openinference-core@0.3.3

## 1.2.1

### Patch Changes

- Updated dependencies [710d1d3]
  - @arizeai/openinference-semantic-conventions@0.13.0
  - @arizeai/openinference-core@0.3.2

## 1.2.0

### Minor Changes

- a0e6f30: Support tool_call_id and tool_call.id

### Patch Changes

- Updated dependencies [a0e6f30]
  - @arizeai/openinference-semantic-conventions@0.12.0
  - @arizeai/openinference-core@0.3.1

## 1.1.0

### Minor Changes

- a96fbd5: Add readme documentation

### Patch Changes

- Updated dependencies [f965410]
- Updated dependencies [712b9da]
- Updated dependencies [d200d85]
  - @arizeai/openinference-semantic-conventions@0.11.0
  - @arizeai/openinference-core@0.3.0

## 1.0.0

### Major Changes

- 4f9246f: migrate OpenInferenceSpanProcessor to OpenInferenceSimpleSpanProcessor and OpenInferenceBatchSpanProcessor to allow for filtering exported spans

## 0.1.1

### Patch Changes

- 3b8702a: remove generic log from withSafety and add onError callback
- ff2668c: capture input and output for tools, fix double count of tokens on llm spans / chains
- Updated dependencies [3b8702a]
  - @arizeai/openinference-core@0.2.0

## 0.1.0

### Minor Changes

- 97ca03b: add OpenInferenceSpanProcessor to transform Vercel AI SDK Spans to conform to the OpenInference spec

### Patch Changes

- Updated dependencies [ba142d5]
  - @arizeai/openinference-semantic-conventions@0.10.0
  - @arizeai/openinference-core@0.1.1
### README.md — js/packages/openinference-mastra (new file)
# OpenInference Mastra

[![npm version](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra.svg)](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra)

This package provides a set of utilities to ingest [Mastra](https://github.com/mastra-ai/mastra) spans into platforms like [Arize](https://arize.com/) and [Phoenix](https://phoenix.arize.com/).

## Installation

```shell
npm install --save @arizeai/openinference-mastra
```

A typical Mastra project will already have OpenTelemetry and related packages installed, so you will likely not need to install any additional packages.

## Usage

`@arizeai/openinference-mastra` provides a set of utilities to help you ingest Mastra spans into Phoenix (or any other OpenInference-compatible platform), and works in conjunction with Mastra's OpenTelemetry support. To get started, add OpenTelemetry support to your Mastra project according to the [Mastra Observability guide](https://mastra.ai/en/reference/observability/providers), or follow along with the rest of this README.

To process your Mastra spans, add an `OpenInferenceOTLPTraceExporter` to the `telemetry` configuration of your `Mastra` instance.

```shell
# Set the Phoenix collector endpoint and API key in your environment
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006/v1/traces"
export PHOENIX_API_KEY="your-api-key"
```

```typescript
import { Mastra } from "@mastra/core";
import {
  OpenInferenceOTLPTraceExporter,
  isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
  // ... other config
  telemetry: {
    // this name appears as the project name in the Phoenix UI; rename it as you like
    serviceName: "openinference-mastra-agent",
    enabled: true,
    export: {
      type: "custom",
      exporter: new OpenInferenceOTLPTraceExporter({
        collectorEndpoint: process.env.PHOENIX_COLLECTOR_ENDPOINT,
        // optional: add a bearer auth token if Phoenix or another platform requires it
        apiKey: process.env.PHOENIX_API_KEY,
        // optional: filter out http and other Node service-specific spans;
        // they will still be exported to Mastra, but not to the target of
        // this exporter
        spanFilter: isOpenInferenceSpan,
      }),
    },
  },
});
```
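`spanFilter` accepts an arbitrary predicate over a span, so you can layer custom logic on top of `isOpenInferenceSpan`. The sketch below shows the idea; the minimal span type is illustrative (the real filter receives OpenTelemetry span objects), and the attribute key follows the OpenInference semantic conventions.

```typescript
// Minimal stand-in for the span shape a filter receives; the real filter
// operates on OpenTelemetry spans with the same `attributes` bag.
type MinimalSpan = { name: string; attributes: Record<string, unknown> };

// Keep only spans that carry an OpenInference span kind (e.g. AGENT, LLM, TOOL),
// mirroring what a helper like `isOpenInferenceSpan` checks for.
const hasOpenInferenceKind = (span: MinimalSpan): boolean =>
  typeof span.attributes["openinference.span.kind"] === "string";

console.log(
  hasOpenInferenceKind({
    name: "agent run",
    attributes: { "openinference.span.kind": "AGENT" },
  }),
); // true
console.log(hasOpenInferenceKind({ name: "GET /health", attributes: {} })); // false
```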
|
||
For general details on Mastra's OpenTelemetry support see the [Mastra Observability guide](https://mastra.ai/en/docs/observability/tracing). | ||
|
||
## Examples | ||
|
||
### Weather Agent | ||
|
||
To setup the canonical Mastra weather agent example, and then ingest the spans into Phoenix, follow the steps below. | ||
|
||
- Create a new Mastra project | ||
|
||
```shell | ||
npm create mastra@latest | ||
# answer the prompts, include agent, tools, and the example when asked | ||
cd chosen-project-name | ||
npm install --save @arizeai/openinference-mastra | ||
# export some variables for mastra to use later on | ||
export PHOENIX_COLLECTOR_ENDPOINT="https://localhost:6006/v1/traces" | ||
export PHOENIX_API_KEY="your-api-key" | ||
export OPENAI_API_KEY="your-openai-api-key" | ||
``` | ||
|
||
- Add the OpenInferenceOTLPTraceExporter to your Mastra project | ||
|
||
```typescript | ||
// chosen-project-name/src/index.ts | ||
import { Mastra } from "@mastra/core/mastra"; | ||
import { createLogger } from "@mastra/core/logger"; | ||
import { LibSQLStore } from "@mastra/libsql"; | ||
import { | ||
isOpenInferenceSpan, | ||
OpenInferenceOTLPTraceExporter, | ||
} from "@arizeai/openinference-mastra"; | ||
|
||
import { weatherAgent } from "./agents"; | ||
|
||
export const mastra = new Mastra({ | ||
agents: { weatherAgent }, | ||
storage: new LibSQLStore({ | ||
url: ":memory:", | ||
}), | ||
logger: createLogger({ | ||
name: "Mastra", | ||
level: "info", | ||
}), | ||
telemetry: { | ||
enabled: true, | ||
serviceName: "weather-agent", | ||
export: { | ||
type: "custom", | ||
exporter: new OpenInferenceOTLPTraceExporter({ | ||
apiKey: process.env.PHOENIX_API_KEY, | ||
cephalization marked this conversation as resolved.
Show resolved
Hide resolved
|
||
collectorEndpoint: process.env.PHOENIX_COLLECTOR_ENDPOINT, | ||
spanFilter: isOpenInferenceSpan, | ||
}), | ||
}, | ||
}, | ||
}); | ||
``` | ||
|
||
- Run the agent | ||
|
||
```shell | ||
npm run dev | ||
``` | ||
|
||
- Send a chat message to the agent in the playground [http://localhost:4111/agents/weatherAgent/chat/](http://localhost:4111/agents/weatherAgent/chat/) | ||
|
||
 | ||
|
||
- After a few moments, you should see the spans for the agent's request and response in Phoenix. | ||
- Not sure how to run the Phoenix collector? [Check out the Phoenix docs](https://docs.arize.com/phoenix/self-hosting/deployment-options/docker#docker). | ||
|
||
 | ||
|
||
You've done it! For next steps, check out the [Mastra docs](https://mastra.ai/en/docs) to learn how to add more agents, tools, and storage options to your project. |
(Two files in this diff could not be displayed.)

### package.json — @arizeai/openinference-mastra (new file)
{
  "name": "@arizeai/openinference-mastra",
  "version": "1.0.0",
  "private": false,
  "type": "module",
  "types": "dist/esm/index.d.ts",
  "description": "OpenInference utilities for ingesting Mastra spans",
  "scripts": {
    "prebuild": "rimraf dist",
    "build": "tsc --build tsconfig.esm.json && tsc-alias -p tsconfig.esm.json",
    "postbuild": "echo '{\"type\": \"module\"}' > ./dist/esm/package.json && rimraf dist/test",
    "type:check": "tsc --noEmit",
    "test": "vitest"
  },
  "exports": {
    ".": {
      "import": "./dist/esm/index.js"
    },
    "./utils": {
      "import": "./dist/esm/utils.js"
    }
  },
  "files": [
    "dist",
    "src"
  ],
  "keywords": [
    "openinference",
    "llm",
    "opentelemetry",
    "mastra",
    "agent"
  ],
  "author": "oss-devs@arize.com",
  "license": "Apache-2.0",
  "homepage": "https://github.com/arize-ai/openinference/tree/main/js/packages/openinference-mastra",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/Arize-ai/openinference.git"
  },
  "bugs": {
    "url": "https://github.com/Arize-ai/openinference/issues"
  },
  "dependencies": {
    "@arizeai/openinference-core": "workspace:*",
    "@arizeai/openinference-semantic-conventions": "workspace:*",
    "@arizeai/openinference-vercel": "workspace:*",
    "@opentelemetry/exporter-trace-otlp-proto": "^0.50.0",
    "@opentelemetry/semantic-conventions": "^1.33.0"
  },
  "devDependencies": {
    "@opentelemetry/api": ">=1.0.0 <1.9.0",
    "@opentelemetry/core": "^1.25.1",
    "@opentelemetry/sdk-trace-base": "^1.19.0",
    "vitest": "^3.1.3"
  }
}
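The `exports` map above means only two entry points are importable: the package root and `./utils`, each resolving to the ESM build. As a toy sketch of how Node routes a subpath through this field (simplified; real resolution also handles conditions like `require` and `default`):

```typescript
// The package's exports map, as declared in package.json above.
const exportsMap: Record<string, { import: string }> = {
  ".": { import: "./dist/esm/index.js" },
  "./utils": { import: "./dist/esm/utils.js" },
};

// Resolve a subpath roughly the way Node does for the "import" condition;
// subpaths not listed in the map are not importable at all.
function resolveSubpath(subpath: string): string | undefined {
  return exportsMap[subpath]?.import;
}

console.log(resolveSubpath("."));       // "./dist/esm/index.js"
console.log(resolveSubpath("./utils")); // "./dist/esm/utils.js"
console.log(resolveSubpath("./other")); // undefined
```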