
Commit 80052ac

Merge branch 'bugfix' into master-into-bugfix/2.45.0-2.46.0-dev

2 parents: 6a1bb84 + 08138f9

31 files changed: +3905 -243 lines

docs/content/en/connecting_your_tools/parsers/file/generic.md

Lines changed: 32 additions & 0 deletions
@@ -18,9 +18,13 @@ Attributes supported for CSV:
 - Verified: Indicator if the finding has been verified. Must be empty, TRUE, or FALSE
 - FalsePositive: Indicator if the finding is a false positive. Must be TRUE, or FALSE.
 - Duplicate: Indicator if the finding is a duplicate. Must be TRUE, or FALSE
+- IsMitigated: Indicator if the finding is mitigated. Must be TRUE, or FALSE
+- MitigatedDate: Date the finding was mitigated, in mm/dd/yyyy format or ISO format
 
 The CSV expects a header row with the names of the attributes.
 
+Date fields are parsed using [dateutil.parse](https://dateutil.readthedocs.io/en/stable/parser.html), supporting a variety of formats such as YYYY-MM-DD or ISO-8601.
+
 Example of JSON format:
 
 ```JSON
@@ -70,6 +74,34 @@ Example of JSON format:
             "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
             "file_path": "src/threeeeeeeeee.cpp",
             "line": 1353
+        },
+        {
+            "title": "test title mitigated",
+            "description": "Some very long description with\n\n some UTF-8 chars à qu'il est beau2",
+            "severity": "Critical",
+            "mitigation": "Some mitigation",
+            "date": "2021-01-06",
+            "cve": "CVE-2020-36236",
+            "cwe": 287,
+            "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
+            "file_path": "src/threeeeeeeeee.cpp",
+            "line": 1353,
+            "is_mitigated": true,
+            "mitigated": "2021-01-16"
+        },
+        {
+            "title": "test title mitigated ISO",
+            "description": "Some very long description with\n\n some UTF-8 chars à qu'il est beau2",
+            "severity": "Critical",
+            "mitigation": "Some mitigation",
+            "date": "2024-01-04T11:02:11Z",
+            "cve": "CVE-2020-36236",
+            "cwe": 287,
+            "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
+            "file_path": "src/threeeeeeeeee.cpp",
+            "line": 1353,
+            "is_mitigated": true,
+            "mitigated": "2024-01-24T11:02:11Z"
         }
     ]
 }
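
The hunk above notes that date fields go through `dateutil.parse`. The following is a minimal illustrative sketch (not DefectDojo's parser code) showing how the formats the doc mentions (YYYY-MM-DD, mm/dd/yyyy, and ISO-8601) are handled by that library:

```python
# Illustrative only: how python-dateutil parses the date formats the
# generic parser doc mentions. Not DefectDojo's actual import code.
from dateutil import parser

for raw in ("2021-01-16", "01/16/2021", "2024-01-24T11:02:11Z"):
    print(raw, "->", parser.parse(raw).isoformat())
```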
Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
+---
+title: "Sysdig Vulnerability Reports"
+toc_hide: true
+---
+Import CSV report files generated by the [Sysdig CLI Scanner](https://docs.sysdig.com/en/sysdig-secure/install-agent-components/install-vulnerability-cli-scanner/)
+
+### Sample Scan Data
+Sample Sysdig Vulnerability Reports scans can be found [here](https://github.com/DefectDojo/django-DefectDojo/tree/master/unittests/scans/sysdig_cli).

docs/content/en/connecting_your_tools/parsers/file/sysdig_reports.md

Lines changed: 1 addition & 2 deletions
@@ -4,8 +4,7 @@ toc_hide: true
 ---
 Import CSV report files from Sysdig or a Sysdig UI JSON Report
 Parser will accept Pipeline, Registry and Runtime reports created from the UI
-
-More information available at [our reporting docs page](https://docs.sysdig.com/en/docs/sysdig-secure/vulnerabilities/reporting)
+More information is available at the [Sysdig reporting docs page](https://docs.sysdig.com/en/docs/sysdig-secure/vulnerabilities/reporting)
 
 ### Sample Scan Data
 Sample Sysdig Vulnerability Reports scans can be found [here](https://github.com/DefectDojo/django-DefectDojo/tree/master/unittests/scans/sysdig_reports).

docs/content/en/open_source/installation/running-in-production.md

Lines changed: 16 additions & 4 deletions
@@ -28,6 +28,18 @@ With a separate database, the minimum recommendations to run DefectDojo are:
   a different disk than your OS's for potential performance
   improvements.
 
+### Security
+Verify the `nginx` configuration and other run-time aspects such as security headers to comply with your compliance requirements.
+Change the AES256 encryption key `&91a*agLqesc*0DJ+2*bAbsUZfR*4nLw` in `docker-compose.yml` to something unique for your instance.
+This encryption key is used to encrypt API keys and other credentials stored in DefectDojo to connect to external tools such as SonarQube. A key can be generated in various ways, for example using a password manager or `openssl`:
+
+```
+openssl rand -base64 32
+```
+```
+DD_CREDENTIAL_AES_256_KEY: "${DD_CREDENTIAL_AES_256_KEY:-<PUT THE GENERATED KEY HERE>o}"
+```
+
 ## File Backup
 
 In both cases (dedicated DB or containerized), if you are self-hosting, it is recommended that you implement periodic backups of your data.
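
As a side note on the key-generation step shown in the Security hunk above: if `openssl` is not at hand, a value of the same shape (32 random bytes, base64-encoded) can be produced with Python's standard library. A minimal sketch, not taken from the DefectDojo docs:

```python
# Sketch: generate a random base64 key equivalent in shape to the output of
# `openssl rand -base64 32`; paste the printed value into DD_CREDENTIAL_AES_256_KEY.
import base64
import secrets

print(base64.b64encode(secrets.token_bytes(32)).decode())
```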
@@ -55,7 +67,7 @@ concurrent connections.
 
 ### Celery worker
 
-By default, a single mono-process celery worker is spawned. When storing a large amount of findings, leveraging async functions (like deduplication), or both. Eventually, it is important to adjust these parameters to prevent resource starvation.
+By default, a single mono-process celery worker is spawned. When storing a large amount of findings or running large imports, it might be helpful to adjust these parameters to prevent resource starvation.
 
 The following variables can be changed to increase worker performance, while keeping a single celery container.
 
@@ -80,8 +92,8 @@ and see what is in effect.
 
 <span style="background-color:rgba(242, 86, 29, 0.3)">This experimental feature has been deprecated as of DefectDojo 2.44.0 (March release). Please exercise caution if using this feature with an older version of DefectDojo, as results may be inconsistent.</span>
 
-Import and Re-Import can also be configured to handle uploads asynchronously to aid in
-processing especially large scans. It works by batching Findings and Endpoints by a
+Import and Re-Import can also be configured to handle uploads asynchronously to aid in
+processing especially large scans. It works by batching Findings and Endpoints by a
 configurable amount. Each batch will be processed in separate celery tasks.
 
 The following variables impact async imports.
@@ -90,7 +102,7 @@ The following variables impact async imports.
 - `DD_ASYNC_FINDING_IMPORT_CHUNK_SIZE` defaults to 100
 
 When using asynchronous imports with dynamic scanners, Endpoints will continue to "trickle" in
-even after the import has returned a successful response. This is because processing continues
+even after the import has returned a successful response. This is because processing continues
 to occur after the Findings have already been imported.
 
 To determine if an import has been fully completed, please see the progress bar in the appropriate test.
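
To make the batching behaviour described above concrete, here is a minimal sketch of the idea (hypothetical code, not DefectDojo's importer): findings are split into chunks of the configured size and each chunk is handed to its own worker task.

```python
# Hypothetical sketch of chunked async importing; `submit_batch` stands in
# for dispatching a celery task and is not a real DefectDojo function.
CHUNK_SIZE = 100  # mirrors the DD_ASYNC_FINDING_IMPORT_CHUNK_SIZE default


def chunks(items, size):
    for start in range(0, len(items), size):
        yield items[start:start + size]


def submit_batch(batch):
    print(f"queued batch of {len(batch)} findings")


findings = [f"finding-{i}" for i in range(250)]  # stand-in for parsed findings
for batch in chunks(findings, CHUNK_SIZE):
    submit_batch(batch)
```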
