feat(node): Add OpenAI integration #17022
Conversation
size-limit report 📦
dev-packages/node-integration-tests/suites/tracing/openai/scenario.mjs
Do the types and utils for openai have to live in `@sentry/core`? It's node only, I'd rather that they are in the node package.

The only thing that should be in core are the openai-attributes. I would prefer if we renamed that file to `gen-ai-attributes.ts` and made it more generic (the vercel ai attributes file can just re-export the attributes from `gen-ai-attributes.ts`).

We put the vercelai stuff in `@sentry/core` because it can also be used by the cloudflare and vercel-edge SDKs.
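The proposed layout could look roughly like the sketch below. The file name is taken from the suggestion above, but the specific constants are illustrative assumptions following the OpenTelemetry `gen_ai` semantic conventions, not the final API:

```typescript
// gen-ai-attributes.ts (suggested name): generic attribute constants shared
// across AI integrations, following the OTel gen_ai semantic conventions.
// The two constants below are examples for illustration only.
export const GEN_AI_SYSTEM_ATTRIBUTE = 'gen_ai.system';
export const GEN_AI_REQUEST_MODEL_ATTRIBUTE = 'gen_ai.request.model';

// A vercel-ai attributes file could then just re-export them:
// export * from './gen-ai-attributes';
```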
The types and utils are primarily used in

Sure thing, I will rename it to `gen-ai-attributes.ts` so we can make use of it later.
Oh great call, didn't realize the openai sdk itself only works on the node runtime, will move!
@AbhiPrasad are you sure about the library not running on non-node runtimes? According to their README, vercel-edge and cloudflare are supported. We discussed on Monday in person that we'll start with this being a node instrumentation, but that we want to have it available on vercel-edge and cloudflare eventually. We can't use
This is my mistake! Looks like with the latest major version they don't rely on node types for anything anymore (https://github.com/openai/openai-node/blob/2d0c448912f0adc1caa5052937971bc7bd6daaee/MIGRATION.md?plain=1#L23-L26). Sorry for the confusion. I still think this should live in
Just a follow-up on this: after syncing with Abhi, we have decided to keep things in core for now, as we are aiming to tackle non-node environments very soon.
packages/node/src/integrations/tracing/openai/instrumentation.ts
packages/node/src/integrations/tracing/openai/instrumentation.ts
```ts
  });
} as unknown as abstract new (...args: unknown[]) => OpenAiClient;

// Preserve static and prototype chains
```
l: I wonder if we can extract this into a util and re-use it, as I guess this is mostly boilerplate at this point. Nothing to do in this PR, this is fine; just maybe a future iteration improvement.
I agree. I think I'll add that to the refactor we're planning for shared utils and attributes for vercel AI and openai
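As a sketch of what such a util might look like (hypothetical name and signature, not part of this PR), a helper that re-attaches both the static side and the prototype chain of the original class onto the wrapper:

```typescript
// Hypothetical helper: after wrapping a class, restore the original's
// static members and prototype chain on the wrapper, so statics, instance
// methods and `instanceof` checks keep working. Loose `any` typing is
// deliberate for this sketch.
function preserveClassShape<T extends new (...args: any[]) => any>(
  wrapper: T,
  original: T,
): T {
  // Statics: make the wrapper constructor inherit from the original one.
  Object.setPrototypeOf(wrapper, original);
  // Instances: chain the wrapper's prototype to the original's prototype.
  Object.setPrototypeOf(wrapper.prototype, original.prototype);
  return wrapper;
}
```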
dev-packages/node-integration-tests/suites/tracing/openai/scenario.mjs
packages/node/src/integrations/tracing/openai/instrumentation.ts
packages/node/src/integrations/tracing/openai/instrumentation.ts
Bug: OpenAI Integration Logic and Data Handling Errors

The OpenAI instrumentation has three issues:

- The logic for `shouldRecordInputsAndOutputs` incorrectly defaults to `false` if the OpenAI integration is not found, ignoring the client's `sendDefaultPii` setting.
- The optional chaining for `client?.getOptions().sendDefaultPii` is incomplete, potentially causing a `TypeError` if `getOptions()` returns `undefined`.
- In `addRequestAttributes`, both `params.messages` and `params.input` set the same `GEN_AI_REQUEST_MESSAGES_ATTRIBUTE`, leading to data loss if both are present in the request.
sentry-javascript/packages/core/src/utils/openai/index.ts, lines 191 to 204 at 86927ff:
```ts
function addRequestAttributes(span: Span, params: Record<string, unknown>): void {
  if ('messages' in params) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.messages) });
  }
  if ('input' in params) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.input) });
  }
}

function getOptionsFromIntegration(): OpenAiOptions {
  const scope = getCurrentScope();
  const client = scope.getClient();
  const integration = client?.getIntegrationByName(OPENAI_INTEGRATION_NAME) as OpenAiIntegration | undefined;
  const shouldRecordInputsAndOutputs = integration ? Boolean(client?.getOptions().sendDefaultPii) : false;
```
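One way the attribute-overwrite issue could be addressed is sketched below. This is an illustration, not the actual patch: the `Span` stub and the attribute string value are assumptions standing in for the real types and constants in `@sentry/core`.

```typescript
// Minimal stubs for illustration only; the real types live in @sentry/core.
type Span = { setAttributes(attrs: Record<string, string>): void };
const GEN_AI_REQUEST_MESSAGES_ATTRIBUTE = 'gen_ai.request.messages';

function addRequestAttributes(span: Span, params: Record<string, unknown>): void {
  // Prefer `messages` and fall back to `input`, so the two keys no longer
  // overwrite the same attribute when both are present in the request.
  const value =
    'messages' in params ? params.messages : 'input' in params ? params.input : undefined;
  if (value !== undefined) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(value) });
  }
}
```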
🚀 nice work!
This PR adds official support for instrumenting OpenAI SDK calls in Node with Sentry tracing, following OpenTelemetry semantic conventions for Generative AI.
We currently instrument the following OpenAI SDK methods:
Currently supported:
The openAIIntegration() accepts the following options:
Example:
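A minimal setup sketch, assuming the standard `Sentry.init` pattern for Node; the option names shown are illustrative assumptions and not confirmed by this excerpt (config fragment, not runnable standalone):

```typescript
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__YOUR_DSN__',
  tracesSampleRate: 1.0,
  // sendDefaultPii gates whether request/response content is recorded
  // on spans (per the shouldRecordInputsAndOutputs discussion above).
  sendDefaultPii: true,
  integrations: [Sentry.openAIIntegration()],
});
```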