
Commit ce10a86

feat: Mastra instrumentation

1 parent 5077a06
File tree

11 files changed: +1128 -0 lines changed
Lines changed: 110 additions & 0 deletions

# @arizeai/openinference-vercel

## 2.0.3

### Patch Changes

- Updated dependencies [ae5cd15]
  - @arizeai/openinference-semantic-conventions@1.1.0
  - @arizeai/openinference-core@1.0.2

## 2.0.2

### Patch Changes

- Updated dependencies [c4e2252]
  - @arizeai/openinference-semantic-conventions@1.0.1
  - @arizeai/openinference-core@1.0.1

## 2.0.1

### Patch Changes

- 365a3c2: Updated the OpenInference semantic convention mapping to account for changes to the Vercel AI SDK semantic conventions

## 2.0.0

### Major Changes

- 16a3815: ESM support

  Packages are now shipped as "Dual Package", meaning that both ESM and CJS module resolution should be supported for each package.

  Support is described as "experimental" because OpenTelemetry describes support for auto-instrumenting ESM projects as "ongoing". See https://github.com/open-telemetry/opentelemetry-js/blob/61d5a0e291db26c2af638274947081b29db3f0ca/doc/esm-support.md

### Patch Changes

- Updated dependencies [16a3815]
  - @arizeai/openinference-semantic-conventions@1.0.0
  - @arizeai/openinference-core@1.0.0

## 1.2.2

### Patch Changes

- Updated dependencies [1188c6d]
  - @arizeai/openinference-semantic-conventions@0.14.0
  - @arizeai/openinference-core@0.3.3

## 1.2.1

### Patch Changes

- Updated dependencies [710d1d3]
  - @arizeai/openinference-semantic-conventions@0.13.0
  - @arizeai/openinference-core@0.3.2

## 1.2.0

### Minor Changes

- a0e6f30: Support tool_call_id and tool_call.id

### Patch Changes

- Updated dependencies [a0e6f30]
  - @arizeai/openinference-semantic-conventions@0.12.0
  - @arizeai/openinference-core@0.3.1

## 1.1.0

### Minor Changes

- a96fbd5: Add readme documentation

### Patch Changes

- Updated dependencies [f965410]
- Updated dependencies [712b9da]
- Updated dependencies [d200d85]
  - @arizeai/openinference-semantic-conventions@0.11.0
  - @arizeai/openinference-core@0.3.0

## 1.0.0

### Major Changes

- 4f9246f: migrate OpenInferenceSpanProcessor to OpenInferenceSimpleSpanProcessor and OpenInferenceBatchSpanProcessor to allow for filtering exported spans

## 0.1.1

### Patch Changes

- 3b8702a: remove generic log from withSafety and add onError callback
- ff2668c: capture input and output for tools, fix double count of tokens on llm spans / chains
- Updated dependencies [3b8702a]
  - @arizeai/openinference-core@0.2.0

## 0.1.0

### Minor Changes

- 97ca03b: add OpenInferenceSpanProcessor to transform Vercel AI SDK Spans to conform to the OpenInference spec

### Patch Changes

- Updated dependencies [ba142d5]
  - @arizeai/openinference-semantic-conventions@0.10.0
  - @arizeai/openinference-core@0.1.1
Lines changed: 49 additions & 0 deletions

# OpenInference Mastra

[![npm version](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra.svg)](https://badge.fury.io/js/@arizeai%2Fopeninference-mastra)

This package provides a set of utilities to ingest [Mastra](https://github.com/mastra-ai/mastra) spans into platforms like [Arize](https://arize.com/) and [Phoenix](https://phoenix.arize.com/).

## Installation

```shell
npm install --save @arizeai/openinference-mastra
```

You may also need to install OpenTelemetry in addition to the Mastra packages in your project:

```shell
npm i @opentelemetry/api @opentelemetry/exporter-trace-otlp-proto @arizeai/openinference-semantic-conventions
```

## Usage

`@arizeai/openinference-mastra` works in conjunction with Mastra's OpenTelemetry support. To get started, add OpenTelemetry support to your Mastra project according to the [Mastra Observability guide](https://mastra.ai/en/reference/observability/providers), or follow along with the rest of this README.

To process your Mastra spans, add an `OpenInferenceOTLPTraceExporter` to your OpenTelemetry configuration:

```typescript
import { Mastra } from "@mastra/core";
import { OpenInferenceOTLPTraceExporter } from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
  // ... other config
  telemetry: {
    serviceName: "openinference-mastra-agent", // you can rename this to whatever you want to appear in the Phoenix UI
    enabled: true,
    export: {
      type: "custom",
      exporter: new OpenInferenceOTLPTraceExporter({
        collectorEndpoint: process.env.PHOENIX_COLLECTOR_ENDPOINT,
        apiKey: process.env.PHOENIX_API_KEY,
      }),
    },
  },
});
```
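The config above reads the collector endpoint and API key from environment variables. A minimal way to provide them might look like the following — the endpoint value shown is an assumption for illustration, so substitute the URL and key for your own Phoenix instance:

```shell
# Hypothetical values — point these at your own Phoenix instance.
export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com/v1/traces"
export PHOENIX_API_KEY="your-phoenix-api-key"
```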
For general details on Mastra's OpenTelemetry support see the [Mastra Observability guide](https://mastra.ai/en/docs/observability/tracing).

## Examples

TODO
Lines changed: 6 additions & 0 deletions

```js
/** @type {import('ts-jest').JestConfigWithTsJest} */
module.exports = {
  preset: "ts-jest",
  testEnvironment: "node",
  prettierPath: null,
};
```
Lines changed: 54 additions & 0 deletions

```json
{
  "name": "@arizeai/openinference-mastra",
  "version": "1.0.0",
  "private": false,
  "type": "module",
  "types": "dist/esm/index.d.ts",
  "description": "OpenInference utilities for ingesting Mastra spans",
  "scripts": {
    "prebuild": "rimraf dist",
    "build": "tsc --build tsconfig.esm.json && tsc-alias -p tsconfig.esm.json",
    "postbuild": "echo '{\"type\": \"module\"}' > ./dist/esm/package.json && rimraf dist/test",
    "type:check": "tsc --noEmit",
    "test": "jest"
  },
  "exports": {
    ".": {
      "import": "./dist/esm/index.js"
    }
  },
  "files": [
    "dist",
    "src"
  ],
  "keywords": [
    "openinference",
    "llm",
    "opentelemetry",
    "mastra",
    "agent"
  ],
  "author": "oss-devs@arize.com",
  "license": "Apache-2.0",
  "homepage": "https://github.com/arize-ai/openinference/tree/main/js/packages/openinference-mastra",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/Arize-ai/openinference.git"
  },
  "bugs": {
    "url": "https://github.com/Arize-ai/openinference/issues"
  },
  "dependencies": {
    "@arizeai/openinference-core": "workspace:*",
    "@arizeai/openinference-semantic-conventions": "workspace:*",
    "@arizeai/openinference-vercel": "workspace:*",
    "@opentelemetry/core": "^1.25.1",
    "@opentelemetry/exporter-trace-otlp-proto": "^0.50.0"
  },
  "devDependencies": {
    "@types/jest": "^29.5.12",
    "jest": "^29.7.0",
    "@opentelemetry/sdk-trace-base": "^1.19.0",
    "@opentelemetry/api": ">=1.0.0 <1.9.0"
  }
}
```
Lines changed: 74 additions & 0 deletions

```typescript
import type { ReadableSpan } from "@opentelemetry/sdk-trace-base";
import type { ExportResult } from "@opentelemetry/core";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { addOpenInferenceAttributesToSpan } from "@arizeai/openinference-vercel/utils";

type ConstructorArgs = {
  /**
   * The API key to use for the OpenInference Trace Exporter.
   * If provided, the `Authorization` header will be added to the request with the value `Bearer ${apiKey}`.
   */
  apiKey?: string;
  /**
   * The endpoint to send the traces to.
   */
  collectorEndpoint: string;
  /**
   * A function that filters the spans to be exported.
   * If provided, the span will be exported if the function returns `true`.
   *
   * @example
   * ```ts
   * import type { ReadableSpan } from "@opentelemetry/sdk-trace-base";
   * import { isOpenInferenceSpan, OpenInferenceOTLPTraceExporter } from "@arizeai/openinference-vercel";
   * const spanFilter = (span: ReadableSpan) => {
   *   // add more span filtering logic here if desired
   *   // or just use the default isOpenInferenceSpan filter directly
   *   return isOpenInferenceSpan(span);
   * };
   * const exporter = new OpenInferenceOTLPTraceExporter({
   *   apiKey: "...",
   *   collectorEndpoint: "...",
   *   spanFilter,
   * });
   * ```
   */
  spanFilter?: (span: ReadableSpan) => boolean;
} & Omit<
  NonNullable<ConstructorParameters<typeof OTLPTraceExporter>[0]>,
  "url"
>;

export class OpenInferenceOTLPTraceExporter extends OTLPTraceExporter {
  private readonly spanFilter?: (span: ReadableSpan) => boolean;
  constructor({
    apiKey,
    collectorEndpoint,
    headers,
    spanFilter,
    ...rest
  }: ConstructorArgs) {
    super({
      headers: {
        ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
        ...headers,
      },
      url: collectorEndpoint,
      ...rest,
    });
    this.spanFilter = spanFilter;
  }
  export(
    items: ReadableSpan[],
    resultCallback: (result: ExportResult) => void,
  ) {
    let filteredItems = items.map((i) => {
      addOpenInferenceAttributesToSpan(i);
      return i;
    });
    if (this.spanFilter) {
      filteredItems = filteredItems.filter(this.spanFilter);
    }
    super.export(filteredItems, resultCallback);
  }
}
```
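The exporter's `export` method annotates every span with OpenInference attributes and then applies the optional `spanFilter` before delegating to the base OTLP exporter. A self-contained sketch of that filter step, using a hypothetical stand-in `Span` shape rather than the real `ReadableSpan` type:

```typescript
// Stand-in span shape for illustration only (the real code uses ReadableSpan).
type Span = { name: string; attributes: Record<string, unknown> };

// A filter in the spirit of isOpenInferenceSpan: keep only spans that
// carry an OpenInference span kind attribute.
const spanFilter = (span: Span): boolean =>
  "openinference.span.kind" in span.attributes;

const spans: Span[] = [
  { name: "agent run", attributes: { "openinference.span.kind": "AGENT" } },
  { name: "http request", attributes: {} },
];

// Mirrors the tail of export(): annotation happens first (omitted here),
// then non-matching spans are dropped before being handed to super.export.
const exported = spans.filter(spanFilter);
```

Because filtering happens inside `export`, spans that fail the predicate are never sent over the wire at all, rather than being marked or sampled downstream.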
Lines changed: 1 addition & 0 deletions

```typescript
export * from "./OpenInferenceTraceExporter.js";
```
