src/platforms/python/guides/celery/index.mdx: 22 additions & 17 deletions
@@ -8,34 +8,34 @@ description: "Learn about using Sentry with Celery."
 
 The Celery integration adds support for the [Celery Task Queue System](https://docs.celeryq.dev/).
 
-Just add `CeleryIntegration()` to your `integrations` list:
+## Install
+
+Install `sentry-sdk` from PyPI with the `celery` extra:
+
+```bash
+pip install --upgrade 'sentry-sdk[celery]'
+```
+
+## Configure
+
+If you have the `celery` package in your dependencies, the Celery integration will be enabled automatically when you initialize the Sentry SDK.
+
+Make sure that the **call to `init` is loaded on worker startup**, and not only in the module where your tasks are defined. Otherwise, the initialization happens too late and events might end up not being reported.
 
 <SignInNote />
 
 ```python
 import sentry_sdk
-from sentry_sdk.integrations.celery import CeleryIntegration
 
 sentry_sdk.init(
     dsn='___PUBLIC_DSN___',
-    integrations=[
-        CeleryIntegration(),
-    ],
-
     # Set traces_sample_rate to 1.0 to capture 100%
     # of transactions for performance monitoring.
-    # We recommend adjusting this value in production,
     traces_sample_rate=1.0,
 )
 ```
 
-Additionally, the Sentry Python SDK will set the transaction on the event to the task name, and it will improve the grouping for global Celery errors such as timeouts.
-
-The integration will automatically report errors from all celery jobs.
-
-Generally, make sure that the **call to `init` is loaded on worker startup**, and not only in the module where your tasks are defined. Otherwise, the initialization happens too late and events might end up not being reported.
-
-## Standalone Setup
+### Standalone Setup
 
 If you're using Celery standalone, there are two ways to set this up:
 
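As a supplement to the Configure hunk above, a minimal sketch of what "loaded on worker startup" amounts to in practice: calling `init` at module import time in the module the worker loads, rather than only inside a task module. The module name, broker URL, and task below are illustrative assumptions, not part of the diffed page.

```python
# tasks.py -- hypothetical module started with `celery -A tasks worker`
import sentry_sdk
from celery import Celery

# Module-level init: this runs as soon as the worker imports the app module,
# i.e. on worker startup, which is what the Configure section asks for.
sentry_sdk.init(
    dsn="___PUBLIC_DSN___",
    traces_sample_rate=1.0,
)

app = Celery("tasks", broker="redis://localhost:6379/0")  # broker URL is an assumption


@app.task
def add(x, y):
    return x + y
```

Running `celery -A tasks worker` imports `tasks.py`, so the `init` call above executes before any task is picked up.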
@@ -51,16 +51,16 @@ If you're using Celery standalone, there are two ways to set this up:
 #@signals.worker_init.connect
 @signals.celeryd_init.connect
 def init_sentry(**_kwargs):
-    sentry_sdk.init(dsn="...")
+    sentry_sdk.init(...)  # same as above
 ```
 
-## Setup With Django
+### Setup With Django
 
 If you're using Celery with Django in a conventional setup, have already initialized the SDK in [your `settings.py` file](/platforms/python/guides/django/#configure), and have Celery using the same settings with [`config_from_object`](https://docs.celeryq.dev/en/stable/django/first-steps-with-django.html), you don't need to initialize the SDK separately for Celery.
 
 ## Verify
 
-To verify if your SDK is initialized on worker start, you can pass `debug=True` to see extra output when the SDK is initialized. If the output appears during worker startup and not only after a task has started, then it's working properly.
+To verify if your SDK is initialized on worker start, you can pass `debug=True` to `sentry_sdk.init()` to see extra output when the SDK is initialized. If the output appears during worker startup and not only after a task has started, then it's working properly.
 
 <Alert level="info" title="Note on distributed tracing">
 
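To make the "conventional setup" in the Django paragraph above concrete, here is a minimal sketch of the usual `proj/celery.py` bootstrap; `proj` is a placeholder project name, and the `sentry_sdk.init()` call itself is assumed to live in `proj/settings.py` as described in the linked Django guide.

```python
# proj/celery.py -- conventional Django/Celery bootstrap (placeholder project name)
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

app = Celery("proj")
# Celery reads its configuration from the Django settings module, which is
# also where sentry_sdk.init() already runs, so the worker picks up the SDK
# on startup without a separate init call for Celery.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

For the Verify step in this setup, `debug=True` would be passed to the existing `sentry_sdk.init()` call in `settings.py`.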
@@ -156,3 +156,8 @@ my_task_b.apply_async(
 
 # Note: overriding the tracing behaviour using `task_x.delay()` is not possible.
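The final hunk is truncated here, but the note about `task_x.delay()` follows from the Celery API itself: `delay()` only forwards task arguments, while `apply_async()` also accepts per-call options such as message `headers`, which is where tracing behaviour can be overridden. A hypothetical sketch; the task and header contents are placeholders, not taken from the diff.

```python
# Hypothetical task for illustration only, reusing the illustrative app module
# from the earlier sketch.
from tasks import app


@app.task
def my_task_b(x, y):
    return x * y


# delay() is shorthand for apply_async(args, kwargs) and exposes no per-call
# options, so the tracing behaviour cannot be overridden through it.
my_task_b.delay(1, 2)

# apply_async() takes call options such as `headers`; the exact header values
# Sentry uses for tracing are elided in the truncated hunk above.
my_task_b.apply_async(args=(1, 2), headers={})
```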