feat: Mastra instrumentation #1598
base: main
Conversation
@arizeai/openinference-core
@arizeai/openinference-instrumentation-beeai
@arizeai/openinference-instrumentation-langchain
@arizeai/openinference-instrumentation-mcp
@arizeai/openinference-instrumentation-openai
@arizeai/openinference-mastra
@arizeai/openinference-semantic-conventions
@arizeai/openinference-vercel
Force-pushed 67529d3 to c3b75ee
Force-pushed 13029a8 to ba978a9
This fixes transient tracing issues with Mastra due to duplicate dependencies
Force-pushed 6d8907d to c416de2
import type { ReadableSpan } from "@opentelemetry/sdk-trace-base";
import type { ExportResult } from "@opentelemetry/core";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { addOpenInferenceAttributesToSpan } from "@arizeai/openinference-vercel/utils";
Is there a plan to pull this up and out of vercel, since it isn't necessarily tied to that anymore? It feels a little weird to be pulling from that package here. On the other hand, I get that it is processing their (Vercel's) attributes. But as we talk more about span processors and such, it feels like this would want to be moved to core or something... just thinking.
@Parker-Stafford - I think because mastra uses vercel under the hood
Gotcha. I just feel like we should hoist the processing utils into core at some point. This one is Vercel-convention specific, but it would feel better not to couple the packages, especially given that more processors are coming. Maybe I'm off on that, though.
I think I agree, maybe for the GenAI experimental conventions, but it would just be strange for the ai conventions not to live in vercel right now.
// child spans with an agent prefix in their name are agent spans
(hasParent &&
  MASTRA_AGENT_SPAN_NAME_PREFIXES.some((prefix) =>
    spanName.startsWith(prefix),
  )) ||
// root spans with a root span prefix in their name are agent spans
(!hasParent &&
  MASTRA_ROOT_SPAN_NAME_PREFIXES.some((prefix) =>
    spanName.startsWith(prefix),
  ))
nit: might pull these out for readability, e.g.
const isAgentSpan = mastra_agent...
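The refactor suggested in the nit could look roughly like this. This is a hedged sketch: the prefix values and the helper names (`isAgentChildSpan`, `isAgentRootSpan`) are illustrative stand-ins, not the package's actual constants.

```typescript
// Illustrative stand-ins for the package's MASTRA_*_SPAN_NAME_PREFIXES
// constants; the real values live in the Mastra instrumentation package.
const MASTRA_AGENT_SPAN_NAME_PREFIXES = ["agent."];
const MASTRA_ROOT_SPAN_NAME_PREFIXES = ["POST /api/agents"];

// Named predicates so the combined boolean reads at a glance.
const isAgentChildSpan = (spanName: string, hasParent: boolean): boolean =>
  hasParent &&
  MASTRA_AGENT_SPAN_NAME_PREFIXES.some((prefix) => spanName.startsWith(prefix));

const isAgentRootSpan = (spanName: string, hasParent: boolean): boolean =>
  !hasParent &&
  MASTRA_ROOT_SPAN_NAME_PREFIXES.some((prefix) => spanName.startsWith(prefix));

const isAgentSpan = (spanName: string, hasParent: boolean): boolean =>
  isAgentChildSpan(spanName, hasParent) || isAgentRootSpan(spanName, hasParent);
```

The behavior is unchanged; the two branches of the original expression just get names.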
…r.ts
Co-authored-by: Parker Stafford <52351508+Parker-Stafford@users.noreply.github.com>
Co-authored-by: Mikyo King <mikyo@arize.com>
 * This function will attempt to detect the closest OpenInference span kind for the given Mastra span,
 * based on the span's name and parent span ID.
 */
const detectOpenInferenceSpanKindFromMastraSpan = (
Just curious (maybe I should keep reading, haha): this function will only ever return AGENT. Also, why the need for the check inside here? Not saying it's wrong, but it feels weird to call this function if the OpenInference span kind is already set.
export const addOpenInferenceAttributesToMastraSpan = (span: ReadableSpan) => {
  enrichBySpanKind(span);
};
I'm guessing this is just future-proofing, since right now all this does is add the agent span kind? I would maybe opt to simplify the whole enrichBySpanKind layer and just use something like addOpenInferenceSpanKind on the Mastra span, then add more support when we need it. I don't feel too strongly, but it feels like this might get changed in the future anyway, so just build for what we support now.
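The simplification being suggested could be sketched as follows. This is a hedged sketch: `addOpenInferenceSpanKindToMastraSpan` and the minimal `SpanLike` shape are illustrative (the real code operates on `ReadableSpan`); the `openinference.span.kind` attribute key mirrors the OpenInference semantic conventions.

```typescript
// Minimal illustrative stand-in for the mutable attribute bag on ReadableSpan.
interface SpanLike {
  attributes: Record<string, unknown>;
}

const OPENINFERENCE_SPAN_KIND = "openinference.span.kind";

// Set the OpenInference span kind directly instead of routing through a
// generic enrichBySpanKind layer; extend the union as more kinds are supported.
const addOpenInferenceSpanKindToMastraSpan = (
  span: SpanLike,
  kind: "AGENT" | "CHAIN" | "LLM" | "TOOL",
): void => {
  // Only set the kind if one hasn't already been detected upstream.
  if (span.attributes[OPENINFERENCE_SPAN_KIND] == null) {
    span.attributes[OPENINFERENCE_SPAN_KIND] = kind;
  }
};
```

Since AGENT is the only kind currently detected, the caller would just pass `"AGENT"` for now.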
I did a lot of renaming and removed redundant code that was no longer necessary.
 *
 * @param span - The span to check.
 */
export const isOpenInferenceSpan = (span: ReadableSpan) => {
Yeah, this makes me feel more strongly about the comment above: move it to core/utils. Or, if we want, we could have a utils package as well, though that might be overkill.
To test, follow along with the example at the end of the new README. When you reach the installation step for @arizeai/openinference-mastra, install

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@1598

instead.

Resolves #1599
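For context while testing, the wiring the README walks through looks roughly like the following. This is a hedged sketch, not the canonical example: the exporter name (`OpenInferenceOTLPTraceExporter`), the `spanFilter` option, and the `COLLECTOR_ENDPOINT` environment variable are assumptions based on my reading of this PR and may differ in the published package.

```typescript
import { Mastra } from "@mastra/core";
import {
  OpenInferenceOTLPTraceExporter,
  isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
  telemetry: {
    serviceName: "my-mastra-app", // assumption: any service name works here
    enabled: true,
    export: {
      type: "custom",
      // Export only OpenInference spans to the collector endpoint.
      exporter: new OpenInferenceOTLPTraceExporter({
        url: process.env.COLLECTOR_ENDPOINT,
        spanFilter: isOpenInferenceSpan,
      }),
    },
  },
});
```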