feat(node): Add OpenAI integration #17022


Merged
merged 9 commits into develop from openai-node on Jul 17, 2025

Conversation

@RulaKhaled (Member) commented Jul 15, 2025

This PR adds official support for instrumenting OpenAI SDK calls in Node with Sentry tracing, following OpenTelemetry semantic conventions for Generative AI.

We currently instrument the following OpenAI SDK methods:

  • client.chat.completions.create() - For chat-based completions
  • client.responses.create() - For the responses API

Currently supported:

  • Node.js - The methods listed above are instrumented in this PR
  • ESM and CJS - Both module systems are supported

The openAIIntegration() accepts the following options:

// The integration respects your sendDefaultPii client option
interface OpenAiOptions {
  recordInputs?: boolean;   // Whether to record prompt messages
  recordOutputs?: boolean;  // Whether to record response text
}

Example:

Sentry.init({
  dsn: '__DSN__',
  sendDefaultPii: false, // Even with PII disabled globally
  integrations: [
    Sentry.openAIIntegration({
      recordInputs: true,    // Force recording prompts
      recordOutputs: true,   // Force recording responses
    }),
  ],
});
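For reference, once the integration is registered, OpenAI calls need no code changes to be traced. A minimal sketch, assuming an ESM module (so top-level await is available); the model name and prompt are illustrative, not taken from this PR:

import OpenAI from 'openai';

const client = new OpenAI();

// Instrumented automatically by the integration above; emits a span
// carrying the Gen AI semantic convention attributes.
const completion = await client.chat.completions.create({
  model: 'gpt-4o', // illustrative model name
  messages: [{ role: 'user', content: 'Hello!' }],
});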

@github-actions bot (Contributor) commented Jul 15, 2025

size-limit report 📦

| Path | Size | % Change | Change |
| --- | --- | --- | --- |
| @sentry/browser | 23.78 kB | - | - |
| @sentry/browser - with treeshaking flags | 22.34 kB | - | - |
| @sentry/browser (incl. Tracing) | 39.66 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) | 77.79 kB | - | - |
| @sentry/browser (incl. Tracing, Replay) - with treeshaking flags | 67.59 kB | - | - |
| @sentry/browser (incl. Tracing, Replay with Canvas) | 82.49 kB | - | - |
| @sentry/browser (incl. Tracing, Replay, Feedback) | 94.59 kB | - | - |
| @sentry/browser (incl. Feedback) | 40.48 kB | - | - |
| @sentry/browser (incl. sendFeedback) | 28.46 kB | - | - |
| @sentry/browser (incl. FeedbackAsync) | 33.36 kB | - | - |
| @sentry/react | 25.54 kB | - | - |
| @sentry/react (incl. Tracing) | 41.62 kB | - | - |
| @sentry/vue | 28.23 kB | - | - |
| @sentry/vue (incl. Tracing) | 41.45 kB | - | - |
| @sentry/svelte | 23.81 kB | - | - |
| CDN Bundle | 25.18 kB | - | - |
| CDN Bundle (incl. Tracing) | 39.42 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) | 75.42 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) | 80.89 kB | - | - |
| CDN Bundle - uncompressed | 73.44 kB | - | - |
| CDN Bundle (incl. Tracing) - uncompressed | 116.86 kB | - | - |
| CDN Bundle (incl. Tracing, Replay) - uncompressed | 231 kB | - | - |
| CDN Bundle (incl. Tracing, Replay, Feedback) - uncompressed | 243.81 kB | - | - |
| @sentry/nextjs (client) | 43.66 kB | - | - |
| @sentry/sveltekit (client) | 40.08 kB | - | - |
| @sentry/node | 168.98 kB | +0.7% | +1.17 kB 🔺 |
| @sentry/node - without tracing | 100.2 kB | - | - |
| @sentry/aws-serverless | 128.36 kB | - | - |


@RulaKhaled changed the title from "[WIP] feat(node): Instrument openai chat and responses api" to "feat(node): Add OpenAI integration" on Jul 16, 2025
@RulaKhaled marked this pull request as ready for review on July 16, 2025 at 10:47

@AbhiPrasad (Member) left a comment

Do the types and utils for openai have to live in @sentry/core? It's node only, I'd rather that they are in the node package.

The only thing that should be in core are the openai-attributes. I would prefer if we renamed that file to gen-ai-attributes.ts and made it more generic (the vercel ai attributes file can just re-export the attributes from gen-ai-attributes.ts).

We put the vercelai stuff in @sentry/core because it can also be used by the cloudflare and vercel-edge sdks.
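For illustration, the suggested rename-and-re-export could be as small as the following (a sketch of the suggestion above, using the file names named in the comment; not the merged code):

// vercel-ai attributes file, re-exporting the shared attributes from the
// renamed gen-ai-attributes.ts as proposed above (sketch only)
export * from './gen-ai-attributes';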

@RulaKhaled (Member, Author) replied

> Do the types and utils for openai have to live in @sentry/core? It's node only, I'd rather that they are in the node package.
>
> The only thing that should be in core are the openai-attributes. I would prefer if we renamed that file to gen-ai-attributes.ts and made it more generic (the vercel ai attributes file can just re-export the attributes from gen-ai-attributes.ts).
>
> We put the vercelai stuff in @sentry/core because it can also be used by the cloudflare and vercel-edge sdks.

The types and utils are primarily used in openai.ts, which is the main file we'll use to instrument openai in both Vercel Edge and Cloudflare. We're reusing the types defined in core within the Node integration, which is better than redefining them in Node (something I noticed we do for vercel ai), and potentially again in other places where we want to support openai.

Sure thing, I will rename it to gen-ai-attributes.ts so we can make use of it later.

@AbhiPrasad (Member) replied

> We're reusing the types defined in core within the Node integration, which is better than redefining them in Node (something I noticed we do for vercel ai), and potentially again in other places where we want to support openai.

We're using import-in-the-middle to instrument this package, and the openai sdk itself only works on the node runtime, so I think having it in core is optimizing too early. It'll take us some time to figure out non-node runtimes; once we do, we can easily move stuff from node -> core. For now I'd like us to have the logic in the node sdk.

@RulaKhaled (Member, Author) replied

> We're reusing the types defined in core within the Node integration, which is better than redefining them in Node (something I noticed we do for vercel ai), and potentially again in other places where we want to support openai.

> We're using import-in-the-middle to instrument this package, and the openai sdk itself only works on the node runtime, so I think having it in core is optimizing too early. It'll take us some time to figure out non-node runtimes; once we do, we can easily move stuff from node -> core. For now I'd like us to have the logic in the node sdk.

Oh great call, I didn't realize the openai sdk itself only works on the node runtime. Will move!

@andreiborza (Member) commented

@AbhiPrasad are you sure about the library not running on non-node runtimes? According to their README, vercel-edge and cloudflare are supported.

We discussed on Monday in person that we'll start with this being a node instrumentation, but that we want to have it available on vercel-edge and cloudflare eventually. We can't use import-in-the-middle there, but we discussed that users might have to instrument their OpenAI client manually by calling a function we export in those runtimes.
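A rough sketch of that idea; instrumentOpenAiClient and its import path are purely hypothetical names for a function the SDK might export in those runtimes, not anything this PR ships:

import OpenAI from 'openai';
// Hypothetical: in runtimes without import-in-the-middle, users would wrap
// their client explicitly via an exported helper. This import does not exist.
import { instrumentOpenAiClient } from '@sentry/cloudflare';

const client = instrumentOpenAiClient(new OpenAI());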

@AbhiPrasad (Member) replied

> @AbhiPrasad are you sure about the library not running on non-node runtimes? According to their README, vercel-edge and cloudflare are supported.

This is my mistake! Looks like with the latest major version they don't rely on node types for anything anymore (https://github.com/openai/openai-node/blob/2d0c448912f0adc1caa5052937971bc7bd6daaee/MIGRATION.md?plain=1#L23-L26). Sorry for the confusion.

I still think this should live in @sentry/node until we've figured out the alternative API to using import-in-the-middle. I would prefer to constrain the surface area.

@RulaKhaled (Member, Author) commented Jul 17, 2025

> @AbhiPrasad are you sure about the library not running on non-node runtimes? According to their README, vercel-edge and cloudflare are supported.

> This is my mistake! Looks like with the latest major version they don't rely on node types for anything anymore (https://github.com/openai/openai-node/blob/2d0c448912f0adc1caa5052937971bc7bd6daaee/MIGRATION.md?plain=1#L23-L26). Sorry for the confusion.
>
> I still think this should live in @sentry/node until we've figured out the alternative API to using import-in-the-middle. I would prefer to constrain the surface area.

Just a follow-up on this: after syncing with Abhi, we have decided to keep things in core for now, as we are aiming to tackle non-node environments very soon.

});
} as unknown as abstract new (...args: unknown[]) => OpenAiClient;

// Preserve static and prototype chains
A member commented on the diff excerpt above:

l: I wonder if we can extract this into a util and re-use it, as I guess this is mostly boilerplate at this point. Nothing to do in this PR; this is fine, just maybe a future iteration improvement.

@RulaKhaled (Member, Author) replied:

I agree. I think I'll add that to the refactor we're planning for shared utils and attributes for vercel AI and openai.
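As a sketch of what such a shared util might look like (assumed names, not code from this PR): wrap a client constructor, run a callback on each new instance, and keep the static and prototype chains intact, as the excerpt above does inline.

// Hypothetical shared helper for the suggested refactor (sketch only)
function wrapClientConstructor<T extends new (...args: any[]) => object>(
  Original: T,
  onInstance: (instance: InstanceType<T>) => InstanceType<T>,
): T {
  const Wrapped = function (...args: unknown[]) {
    // Construct the real client, then hand it to the instrumentation hook
    const instance = Reflect.construct(Original, args);
    return onInstance(instance as InstanceType<T>);
  };

  // Preserve static and prototype chains, as in the snippet above
  Object.setPrototypeOf(Wrapped, Original);
  Wrapped.prototype = (Original as unknown as { prototype: object }).prototype;

  return Wrapped as unknown as T;
}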

@cursor bot left a comment

Bug: OpenAI Integration Logic and Data Handling Errors

The OpenAI instrumentation has three issues:

  1. The logic for shouldRecordInputsAndOutputs incorrectly defaults to false if the OpenAI integration is not found, ignoring the client's sendDefaultPii setting.
  2. The optional chaining for client?.getOptions().sendDefaultPii is incomplete, potentially causing a TypeError if getOptions() returns undefined.
  3. In addRequestAttributes, both params.messages and params.input set the same GEN_AI_REQUEST_MESSAGES_ATTRIBUTE, leading to data loss if both are present in the request.

packages/core/src/utils/openai/index.ts#L191-L204

function addRequestAttributes(span: Span, params: Record<string, unknown>): void {
  if ('messages' in params) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.messages) });
  }
  if ('input' in params) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.input) });
  }
}

function getOptionsFromIntegration(): OpenAiOptions {
  const scope = getCurrentScope();
  const client = scope.getClient();
  const integration = client?.getIntegrationByName(OPENAI_INTEGRATION_NAME) as OpenAiIntegration | undefined;
  const shouldRecordInputsAndOutputs = integration ? Boolean(client?.getOptions().sendDefaultPii) : false;
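One way the three reported issues could be addressed; a sketch based on the snippet above, not the fix that was actually committed, and the integration options shape in the return value is assumed:

function addRequestAttributes(span: Span, params: Record<string, unknown>): void {
  if ('messages' in params) {
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.messages) });
  } else if ('input' in params) {
    // Prefer `messages` and fall back to `input` so one request shape no
    // longer silently overwrites the other (issue 3).
    span.setAttributes({ [GEN_AI_REQUEST_MESSAGES_ATTRIBUTE]: JSON.stringify(params.input) });
  }
}

function getOptionsFromIntegration(): OpenAiOptions {
  const client = getCurrentScope().getClient();
  const integration = client?.getIntegrationByName(OPENAI_INTEGRATION_NAME) as OpenAiIntegration | undefined;
  // Respect sendDefaultPii even when the integration lookup fails (issue 1),
  // and optionally chain through getOptions() as well (issue 2).
  const shouldRecordInputsAndOutputs = Boolean(client?.getOptions()?.sendDefaultPii);
  // Explicit per-integration options take precedence; `options` shape assumed.
  return {
    recordInputs: integration?.options?.recordInputs ?? shouldRecordInputsAndOutputs,
    recordOutputs: integration?.options?.recordOutputs ?? shouldRecordInputsAndOutputs,
  };
}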


@andreiborza (Member) left a comment

🚀 nice work!

@andreiborza merged commit f538ef0 into develop on Jul 17, 2025
650 of 659 checks passed
@andreiborza deleted the openai-node branch on July 17, 2025 at 11:34