Docs: 3rd party log exporters #2058

Merged 1 commit on May 15, 2025
`docs/config/config-file.mdx`: 97 changes (79 additions, 18 deletions)
There is a [huge library of instrumentations](https://opentelemetry.io/ecosystem/registry/) that you can use.

Here are some we recommend (a registration sketch follows the note below):

| Package                               | Description                                                                                                               |
| ------------------------------------- | ------------------------------------------------------------------------------------------------------------------------- |
| `@opentelemetry/instrumentation-http` | Logs all HTTP calls                                                                                                       |
| `@prisma/instrumentation`             | Logs all Prisma calls; you need to [enable tracing](https://github.com/prisma/prisma/tree/main/packages/instrumentation)  |
| `@traceloop/instrumentation-openai`   | Logs all OpenAI calls                                                                                                     |

<Note>
`@opentelemetry/instrumentation-fs`, which logs all file system calls, is not currently supported.
</Note>
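
For example, here's a sketch of registering these instrumentations in your `trigger.config.ts` (the constructor names assume each package's standard export; adjust them to match the versions you install):

```ts trigger.config.ts
import { defineConfig } from "@trigger.dev/sdk/v3";
import { HttpInstrumentation } from "@opentelemetry/instrumentation-http";
import { PrismaInstrumentation } from "@prisma/instrumentation";
import { OpenAIInstrumentation } from "@traceloop/instrumentation-openai";

export default defineConfig({
  project: "<project ref>",
  telemetry: {
    instrumentations: [
      new HttpInstrumentation(), // captures outgoing and incoming HTTP calls
      new PrismaInstrumentation(), // requires Prisma tracing to be enabled
      new OpenAIInstrumentation(), // captures OpenAI API calls
    ],
  },
});
```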

### Telemetry Exporters

You can also configure custom telemetry exporters to send your traces and logs to external services. For example, you can send your logs to [Axiom](https://axiom.co/docs/guides/opentelemetry-nodejs#exporter-instrumentation-ts). First, add the OpenTelemetry exporter packages to your `package.json`:

```json package.json
"dependencies": {
"@opentelemetry/exporter-logs-otlp-http": "0.52.1",
"@opentelemetry/exporter-trace-otlp-http": "0.52.1"
}
```

Then, configure the exporters in your `trigger.config.ts` file:

```ts trigger.config.ts
import { defineConfig } from "@trigger.dev/sdk/v3";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { OTLPLogExporter } from "@opentelemetry/exporter-logs-otlp-http";

// Configure OTLP log and trace exporters with Axiom's endpoint URL and headers
export default defineConfig({
  project: "<project ref>",
  // Your other config settings...
  telemetry: {
    instrumentations: [
      // Your instrumentations here
    ],
    logExporters: [
      new OTLPLogExporter({
        url: "https://api.axiom.co/v1/logs",
        headers: {
          Authorization: `Bearer ${process.env.AXIOM_API_TOKEN}`,
          "X-Axiom-Dataset": process.env.AXIOM_DATASET,
        },
      }),
    ],
    exporters: [
      new OTLPTraceExporter({
        url: "https://api.axiom.co/v1/traces",
        headers: {
          Authorization: `Bearer ${process.env.AXIOM_API_TOKEN}`,
          "X-Axiom-Dataset": process.env.AXIOM_DATASET,
        },
      }),
    ],
  },
});
```

Make sure to set the `AXIOM_API_TOKEN` and `AXIOM_DATASET` environment variables in your project.
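
For example, in a local `.env` file (the values below are placeholders; use your own Axiom API token and dataset name):

```bash .env
AXIOM_API_TOKEN=your-axiom-api-token
AXIOM_DATASET=your-axiom-dataset
```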

<Note>
The `logExporters` option is available in the v4 beta SDK. See our [v4 upgrade
guide](/upgrade-to-v4) for more information.
</Note>

Note that you cannot configure exporters using `OTEL_*` environment variables, as they would conflict with our internal telemetry. Instead, configure the exporters by passing options to the `OTLPTraceExporter` and `OTLPLogExporter` constructors. For example, here's how to configure exporting to Honeycomb:

```ts trigger.config.ts
import { defineConfig } from "@trigger.dev/sdk/v3";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { OTLPLogExporter } from "@opentelemetry/exporter-logs-otlp-http";

// Configure OTLP log and trace exporters with Honeycomb's endpoint URL and headers
export default defineConfig({
  project: "<project ref>",
  // Your other config settings...
  telemetry: {
    instrumentations: [
      // Your instrumentations here
    ],
    logExporters: [
      new OTLPLogExporter({
        url: "https://api.honeycomb.io/v1/logs",
        headers: {
          "x-honeycomb-team": process.env.HONEYCOMB_API_KEY,
          "x-honeycomb-dataset": process.env.HONEYCOMB_DATASET,
        },
      }),
    ],
    exporters: [
      new OTLPTraceExporter({
        url: "https://api.honeycomb.io/v1/traces",
        headers: {
          "x-honeycomb-team": process.env.HONEYCOMB_API_KEY,
          "x-honeycomb-dataset": process.env.HONEYCOMB_DATASET,
        },
      }),
    ],
  },
});
```

## Runtime

We currently only officially support the `node` runtime, but you can try our experimental `bun` runtime by setting the `runtime` option in your config file.
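
A minimal sketch of what that looks like (keep the rest of your existing config as it is):

```ts trigger.config.ts
import { defineConfig } from "@trigger.dev/sdk/v3";

export default defineConfig({
  project: "<project ref>",
  // Your other config settings...
  runtime: "bun", // defaults to "node"
});
```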