feat: Mastra instrumentation #1598

Open · wants to merge 27 commits into `main`
Commits
2a0b11b
feat: Mastra instrumentation
cephalization May 13, 2025
eb909ea
changeset
cephalization May 8, 2025
cfb5da9
Update readme
cephalization May 8, 2025
81b8a14
Extract project name from mastra componentName attr
cephalization May 8, 2025
7559559
Replace jest with vitest in mastra package
cephalization May 8, 2025
9ba107e
Restructure exports
cephalization May 8, 2025
09d8d54
Update readme
cephalization May 8, 2025
46ed9b8
Set span kind for mastra spans to agent
cephalization May 9, 2025
7dd05fd
bump lockfile
cephalization May 13, 2025
cb8e3a9
Install optional deps in CI
cephalization May 13, 2025
52476e1
format
cephalization May 13, 2025
c59a634
debug vercel attrs
cephalization May 16, 2025
8f8da8e
feat(openinference-vercel): Instrument multi-part tool calls and tool…
cephalization May 20, 2025
60bc062
fix(openinference-mastra): Do not depend on opentelemetry/core
cephalization May 20, 2025
d05214e
Handle multi content tool messages
cephalization May 20, 2025
7e4c216
changeset
cephalization May 20, 2025
c416de2
Add tests and docs
cephalization May 20, 2025
c6bafea
Update test
cephalization May 20, 2025
8668a07
Format
cephalization May 20, 2025
e3d4515
Update js/packages/openinference-mastra/src/OpenInferenceTraceExporte…
cephalization May 20, 2025
637e046
Update js/packages/openinference-mastra/README.md
cephalization May 20, 2025
0d68d64
do not format test fixtures
cephalization May 20, 2025
349a01a
clarify debug fn
cephalization May 21, 2025
1bfb9ac
Do not rename otel exporter constructor args
cephalization May 21, 2025
53b72c9
Rename variables
cephalization May 21, 2025
218621c
Simplify attribute enrichment logic, rename functions for clarity
cephalization May 21, 2025
098990c
Update otel dependencies to support mastra v0.10
cephalization May 22, 2025
2 changes: 1 addition & 1 deletion .github/workflows/typescript-CI.yaml
@@ -49,7 +49,7 @@ jobs:
version: 9.8.0
- name: Install Dependencies
working-directory: ./js
-        run: pnpm install --frozen-lockfile --no-optional -r
+        run: pnpm install --frozen-lockfile -r
- name: Pre-Build
working-directory: ./js
run: pnpm run -r prebuild
8 changes: 8 additions & 0 deletions js/.changeset/fifty-shirts-design.md
@@ -0,0 +1,8 @@
---
"@arizeai/openinference-mastra": major
"@arizeai/openinference-vercel": minor
---

feat: Mastra instrumentation

Initial instrumentation for Mastra, adhering to OpenInference semantic conventions
5 changes: 5 additions & 0 deletions js/.changeset/many-needles-create.md
@@ -0,0 +1,5 @@
---
"@arizeai/openinference-vercel": minor
---

feat: Instrument tool calls and results from multi-part content messages
2 changes: 2 additions & 0 deletions js/.prettierignore
@@ -1,3 +1,5 @@
pnpm-lock.yaml
dist
.next
__snapshots__
__fixtures__
110 changes: 110 additions & 0 deletions js/packages/openinference-mastra/CHANGELOG.md
@@ -0,0 +1,110 @@
# @arizeai/openinference-vercel

## 2.0.3

### Patch Changes

- Updated dependencies [ae5cd15]
- @arizeai/openinference-semantic-conventions@1.1.0
- @arizeai/openinference-core@1.0.2

## 2.0.2

### Patch Changes

- Updated dependencies [c4e2252]
- @arizeai/openinference-semantic-conventions@1.0.1
- @arizeai/openinference-core@1.0.1

## 2.0.1

### Patch Changes

- 365a3c2: Updated the OpenInference semantic convention mapping to account for changes to the Vercel AI SDK semantic conventions

## 2.0.0

### Major Changes

- 16a3815: ESM support

Packages are now shipped as "Dual Package" meaning that ESM and CJS module resolution
should be supported for each package.

Support is described as "experimental" because opentelemetry describes support for autoinstrumenting
ESM projects as "ongoing". See https://github.com/open-telemetry/opentelemetry-js/blob/61d5a0e291db26c2af638274947081b29db3f0ca/doc/esm-support.md

### Patch Changes

- Updated dependencies [16a3815]
- @arizeai/openinference-semantic-conventions@1.0.0
- @arizeai/openinference-core@1.0.0

## 1.2.2

### Patch Changes

- Updated dependencies [1188c6d]
- @arizeai/openinference-semantic-conventions@0.14.0
- @arizeai/openinference-core@0.3.3

## 1.2.1

### Patch Changes

- Updated dependencies [710d1d3]
- @arizeai/openinference-semantic-conventions@0.13.0
- @arizeai/openinference-core@0.3.2

## 1.2.0

### Minor Changes

- a0e6f30: Support tool_call_id and tool_call.id

### Patch Changes

- Updated dependencies [a0e6f30]
- @arizeai/openinference-semantic-conventions@0.12.0
- @arizeai/openinference-core@0.3.1

## 1.1.0

### Minor Changes

- a96fbd5: Add readme documentation

### Patch Changes

- Updated dependencies [f965410]
- Updated dependencies [712b9da]
- Updated dependencies [d200d85]
- @arizeai/openinference-semantic-conventions@0.11.0
- @arizeai/openinference-core@0.3.0

## 1.0.0

### Major Changes

- 4f9246f: migrate OpenInferenceSpanProcessor to OpenInferenceSimpleSpanProcessor and OpenInferenceBatchSpanProcessor to allow for filtering exported spans

## 0.1.1

### Patch Changes

- 3b8702a: remove generic log from withSafety and add onError callback
- ff2668c: capture input and output for tools, fix double count of tokens on llm spans / chains
- Updated dependencies [3b8702a]
- @arizeai/openinference-core@0.2.0

## 0.1.0

### Minor Changes

- 97ca03b: add OpenInferenceSpanProcessor to transform Vercel AI SDK Spans to conform to the OpenInference spec

### Patch Changes

- Updated dependencies [ba142d5]
- @arizeai/openinference-semantic-conventions@0.10.0
- @arizeai/openinference-core@0.1.1
129 changes: 129 additions & 0 deletions js/packages/openinference-mastra/README.md
@@ -0,0 +1,129 @@
# OpenInference Mastra

[![npm version](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra.svg)](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra)

This package provides a set of utilities to ingest [Mastra](https://github.com/mastra-ai/mastra) spans into platforms like [Arize](https://arize.com/) and [Arize Phoenix](https://phoenix.arize.com/).

## Installation

```shell
npm install --save @arizeai/openinference-mastra
```

A typical Mastra project will already have OpenTelemetry and related packages installed, so you will likely not need to install any additional packages.

## Usage

`@arizeai/openinference-mastra` provides utilities for ingesting Mastra spans into the Phoenix platform (and any other OpenInference-compatible platform), working in conjunction with Mastra's OpenTelemetry support. To get started, add OpenTelemetry support to your Mastra project according to the [Mastra Observability guide](https://mastra.ai/en/reference/observability/providers), or follow along with the rest of this README.

To process your Mastra spans, add an `OpenInferenceOTLPTraceExporter` to the `telemetry` configuration of your `Mastra` instance.

```shell
# Set the Phoenix collector endpoint and API key in your environment
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006/v1/traces"
export PHOENIX_API_KEY="your-api-key"
```

```typescript
import { Mastra } from "@mastra/core";
import {
OpenInferenceOTLPTraceExporter,
isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
// ... other config
telemetry: {
serviceName: "openinference-mastra-agent", // you can rename this to whatever you want to appear in the Phoenix UI
enabled: true,
export: {
type: "custom",
exporter: new OpenInferenceOTLPTraceExporter({
collectorEndpoint: process.env.PHOENIX_COLLECTOR_ENDPOINT,
// optional: add bearer auth token if Phoenix or other platform requires it
apiKey: process.env.PHOENIX_API_KEY,
        // optional: filter out http and other Node.js service spans;
        // they will still be exported to Mastra, but not to the target of
        // this exporter
spanFilter: isOpenInferenceSpan,
}),
},
},
});
```
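If `isOpenInferenceSpan` is too strict or too loose for your needs, you can supply your own `spanFilter` predicate. The sketch below is a hypothetical example, not part of the package API: it assumes spans carry their OpenInference kind on the `openinference.span.kind` attribute (as set by the OpenInference semantic conventions), and the `SpanLike` shape is a simplified stand-in for the OpenTelemetry `ReadableSpan` type.

```typescript
// Simplified stand-in for the OpenTelemetry ReadableSpan shape (assumption
// for illustration; the real exporter passes full ReadableSpan objects).
type AttributeValue = string | number | boolean | undefined;

interface SpanLike {
  name: string;
  attributes: Record<string, AttributeValue>;
}

// Hypothetical custom filter: keep only agent and LLM spans, dropping
// everything else (http, storage, and other infrastructure spans).
const agentAndLlmSpanFilter = (span: SpanLike): boolean => {
  const kind = span.attributes["openinference.span.kind"];
  return kind === "AGENT" || kind === "LLM";
};
```

A predicate like this can then be passed as `spanFilter` in place of `isOpenInferenceSpan`; filtered spans are still recorded by Mastra's own telemetry, only the export to this exporter's target is affected.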

For general details on Mastra's OpenTelemetry support see the [Mastra Observability guide](https://mastra.ai/en/docs/observability/tracing).

## Examples

### Weather Agent

To set up the canonical Mastra weather agent example and ingest its spans into Phoenix, follow the steps below.

- Create a new Mastra project

```shell
npm create mastra@latest
# answer the prompts, include agent, tools, and the example when asked
cd chosen-project-name
npm install --save @arizeai/openinference-mastra
# export some variables for mastra to use later on
export PHOENIX_COLLECTOR_ENDPOINT="http://localhost:6006/v1/traces"
export PHOENIX_API_KEY="your-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```

- Add the OpenInferenceOTLPTraceExporter to your Mastra project

```typescript
// chosen-project-name/src/index.ts
import { Mastra } from "@mastra/core/mastra";
import { createLogger } from "@mastra/core/logger";
import { LibSQLStore } from "@mastra/libsql";
import {
isOpenInferenceSpan,
OpenInferenceOTLPTraceExporter,
} from "@arizeai/openinference-mastra";

import { weatherAgent } from "./agents";

export const mastra = new Mastra({
agents: { weatherAgent },
storage: new LibSQLStore({
url: ":memory:",
}),
logger: createLogger({
name: "Mastra",
level: "info",
}),
telemetry: {
enabled: true,
serviceName: "weather-agent",
export: {
type: "custom",
exporter: new OpenInferenceOTLPTraceExporter({
apiKey: process.env.PHOENIX_API_KEY,
collectorEndpoint: process.env.PHOENIX_COLLECTOR_ENDPOINT,
spanFilter: isOpenInferenceSpan,
}),
},
},
});
```

- Run the agent

```shell
npm run dev
```

- Send a chat message to the agent in the playground [http://localhost:4111/agents/weatherAgent/chat/](http://localhost:4111/agents/weatherAgent/chat/)

![weather agent chat](./docs/mastra-weather-agent.png)

- After a few moments, you should see the spans for the agent's request and response in Phoenix.
- Not sure how to run the Phoenix collector? [Check out the Phoenix docs](https://docs.arize.com/phoenix/self-hosting/deployment-options/docker#docker).

![weather agent spans](./docs/mastra-weather-agent-spans.png)

You've done it! For next steps, check out the [Mastra docs](https://mastra.ai/en/docs) to learn how to add more agents, tools, and storage options to your project.
57 changes: 57 additions & 0 deletions js/packages/openinference-mastra/package.json
@@ -0,0 +1,57 @@
{
"name": "@arizeai/openinference-mastra",
"version": "1.0.0",
"private": false,
"type": "module",
"types": "dist/esm/index.d.ts",
"description": "OpenInference utilities for ingesting Mastra spans",
"scripts": {
"prebuild": "rimraf dist",
"build": "tsc --build tsconfig.esm.json && tsc-alias -p tsconfig.esm.json",
"postbuild": "echo '{\"type\": \"module\"}' > ./dist/esm/package.json && rimraf dist/test",
"type:check": "tsc --noEmit",
"test": "vitest"
},
"exports": {
".": {
"import": "./dist/esm/index.js"
},
"./utils": {
"import": "./dist/esm/utils.js"
}
},
"files": [
"dist",
"src"
],
"keywords": [
"openinference",
"llm",
"opentelemetry",
"mastra",
"agent"
],
"author": "oss-devs@arize.com",
"license": "Apache-2.0",
"homepage": "https://github.com/arize-ai/openinference/tree/main/js/packages/openinference-mastra",
"repository": {
"type": "git",
"url": "git+https://github.com/Arize-ai/openinference.git"
},
"bugs": {
"url": "https://github.com/Arize-ai/openinference/issues"
},
"dependencies": {
"@arizeai/openinference-core": "workspace:*",
"@arizeai/openinference-semantic-conventions": "workspace:*",
"@arizeai/openinference-vercel": "workspace:*",
"@opentelemetry/exporter-trace-otlp-proto": "^0.50.0",
"@opentelemetry/semantic-conventions": "^1.33.0"
},
"devDependencies": {
"@opentelemetry/api": ">=1.0.0 <1.9.0",
"@opentelemetry/core": "^1.25.1",
"@opentelemetry/sdk-trace-base": "^1.19.0",
"vitest": "^3.1.3"
}
}