feat: Mastra instrumentation #1598

Open · wants to merge 26 commits into base main

Conversation

Contributor

@cephalization cephalization commented May 8, 2025

To test, follow along with the example at the end of the new README. When you reach the installation step for @arizeai/openinference-mastra, run npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@1598 instead.

  • Export all spans to Phoenix
  • Optionally export only vercel ai spans to Phoenix
  • Instrument Agent spans
  • Tests

Resolves #1599
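For context, wiring the new package into a Mastra app might look roughly like the sketch below. The export names OpenInferenceOTLPTraceExporter and isOpenInferenceSpan, the Phoenix endpoint, and the telemetry config shape are assumptions for illustration, not confirmed by this thread:

```typescript
import { Mastra } from "@mastra/core";
import {
  // Assumed exports of @arizeai/openinference-mastra for this sketch
  OpenInferenceOTLPTraceExporter,
  isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
  // ...agents, workflows, etc.
  telemetry: {
    serviceName: "my-mastra-app",
    enabled: true,
    export: {
      type: "custom",
      exporter: new OpenInferenceOTLPTraceExporter({
        // Hypothetical local Phoenix collector endpoint
        url: "http://localhost:6006/v1/traces",
        // Omit spanFilter to export all spans to Phoenix;
        // pass isOpenInferenceSpan to export only the AI-related spans
        spanFilter: isOpenInferenceSpan,
      }),
    },
  },
});
```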

pkg-pr-new bot commented May 8, 2025

@arizeai/openinference-core

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-core@1598

@arizeai/openinference-instrumentation-beeai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-beeai@1598

@arizeai/openinference-instrumentation-langchain

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain@1598

@arizeai/openinference-instrumentation-mcp

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-mcp@1598

@arizeai/openinference-instrumentation-openai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-openai@1598

@arizeai/openinference-mastra

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@1598

@arizeai/openinference-semantic-conventions

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-semantic-conventions@1598

@arizeai/openinference-vercel

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-vercel@1598

commit: 218621c

@cephalization force-pushed the cephalization/mastra branch 2 times, most recently from 67529d3 to c3b75ee on May 13, 2025 16:12
@cephalization changed the base branch from main to cephalization/standardize-node-ci on May 13, 2025 16:16
Base automatically changed from cephalization/standardize-node-ci to main on May 16, 2025 22:36
@cephalization force-pushed the cephalization/mastra branch from 13029a8 to ba978a9 on May 16, 2025 22:39
@cephalization force-pushed the cephalization/mastra branch from 6d8907d to c416de2 on May 20, 2025 17:33
@cephalization marked this pull request as ready for review on May 20, 2025 17:41
@cephalization requested a review from a team as a code owner on May 20, 2025 17:41
@dosubot added the size:XXL label (This PR changes 1000+ lines, ignoring generated files.) on May 20, 2025
import type { ReadableSpan } from "@opentelemetry/sdk-trace-base";
import type { ExportResult } from "@opentelemetry/core";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { addOpenInferenceAttributesToSpan } from "@arizeai/openinference-vercel/utils";
Contributor

Is there a plan to pull this up and out of vercel, since it isn't necessarily tied to that anymore? It feels a little weird to be pulling from that package here. On the other hand, I get that it is processing their (vercel's) attributes. But as we talk more about span processors and such, it feels like this would want to move to core or something... just thinking.

Contributor

@Parker-Stafford - I think it's because mastra uses vercel under the hood.

Contributor

Gotcha. I just feel like we should hoist the processing utils into core at some point. This is specific to vercel's conventions, but it would feel better not to couple the packages, given that more processors are coming. Maybe I'm off on that, though.

Contributor

I think I agree for the experimental genai conventions, but it would be strange for the ai conventions to live anywhere but vercel right now.

Comment on lines 54 to 63
// child spans with an agent prefix in their name are agent spans
(hasParent &&
MASTRA_AGENT_SPAN_NAME_PREFIXES.some((prefix) =>
spanName.startsWith(prefix),
)) ||
// root spans with a root span prefix in their name are agent spans
(!hasParent &&
MASTRA_ROOT_SPAN_NAME_PREFIXES.some((prefix) =>
spanName.startsWith(prefix),
))
Contributor

Nit: might pull these out into named booleans for readability, e.g.

const isAgentSpan = mastra_agent...
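The refactor this nit suggests could look something like the sketch below. The prefix lists here are illustrative placeholders, not the actual constants from the PR:

```typescript
// Illustrative placeholder values; the real lists live in the PR's constants.
const MASTRA_AGENT_SPAN_NAME_PREFIXES = ["agent."];
const MASTRA_ROOT_SPAN_NAME_PREFIXES = ["agent", "workflow"];

const isMastraAgentSpan = (spanName: string, hasParent: boolean): boolean => {
  // child spans with an agent prefix in their name are agent spans
  const isPrefixedChildSpan =
    hasParent &&
    MASTRA_AGENT_SPAN_NAME_PREFIXES.some((prefix) =>
      spanName.startsWith(prefix),
    );
  // root spans with a root span prefix in their name are agent spans
  const isPrefixedRootSpan =
    !hasParent &&
    MASTRA_ROOT_SPAN_NAME_PREFIXES.some((prefix) =>
      spanName.startsWith(prefix),
    );
  return isPrefixedChildSpan || isPrefixedRootSpan;
};
```

Extracting each arm of the disjunction into a named boolean keeps the inline comments attached to the condition they describe and makes the final return read as a sentence.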

…r.ts

Co-authored-by: Parker Stafford <52351508+Parker-Stafford@users.noreply.github.com>
Co-authored-by: Mikyo King <mikyo@arize.com>
* This function will attempt to detect the closest OpenInference span kind for the given Mastra span,
* based on the span's name and parent span ID.
*/
const detectOpenInferenceSpanKindFromMastraSpan = (
Contributor

Just curious (maybe I should keep reading, haha): this function will only ever return agent. Also, why the need for the check inside here? Not saying it's wrong, but it feels weird to call this function if the OpenInference span kind is already set.

Comment on lines 30 to 32
export const addOpenInferenceAttributesToMastraSpan = (span: ReadableSpan) => {
enrichBySpanKind(span);
};
Contributor

I'm guessing this is just future-proofing, since right now all this does is add the agent span kind? I would maybe opt to simplify the whole enrichBySpanKind layer and just use an addOpenInferenceSpanKind-to-mastra-span helper or something, adding more support when we need it. I don't feel too strongly, and it will probably get changed in the future anyway, so just build for what we support now.

Contributor Author

I did a lot of renaming and removed redundant code that was no longer necessary.

*
* @param span - The span to check.
*/
export const isOpenInferenceSpan = (span: ReadableSpan) => {
Contributor

Yeah, this makes me feel more strongly about the comment above: move it to core/utils, or if we want we can have a separate utils package as well, though that might be overkill.
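Part of why isOpenInferenceSpan is a candidate for core is that it only needs the span's attribute bag, not anything vercel-specific. A minimal sketch of such a predicate, assuming the OpenInference attribute key openinference.span.kind and using a structural stand-in for the OpenTelemetry ReadableSpan (the real implementation's details may differ):

```typescript
// Minimal structural stand-in for the part of an OTel ReadableSpan we need.
interface SpanLike {
  attributes: Record<string, unknown>;
}

// Assumed OpenInference semantic-convention key for this sketch.
const OPENINFERENCE_SPAN_KIND = "openinference.span.kind";

// A span counts as an OpenInference span once an instrumentation or span
// processor has stamped a span-kind attribute onto it.
const isOpenInferenceSpan = (span: SpanLike): boolean =>
  typeof span.attributes[OPENINFERENCE_SPAN_KIND] === "string";
```

A predicate with this shape can be handed to an exporter as a spanFilter to drop non-AI spans before they leave the process.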

Development

Successfully merging this pull request may close these issues.

[feature request] [js] Mastra Instrumentation
3 participants