
Commit 5bf5c71

lauraschauercopernico authored and committed
adds readme
1 parent c1807de commit 5bf5c71

File tree

3 files changed: +52 −3 lines changed


prospector/pipeline/README.md

Lines changed: 49 additions & 0 deletions
@@ -0,0 +1,49 @@
# Pipeline Usage of Prospector

The pipeline works in the following way:

1. `get_cve_data()` in `filter_entries.py` first fetches the raw data of the most recent CVEs.
2. This raw data gets saved to the `vulnerability` table of the database.
3. The raw vulnerability data is then fetched from the database and filtered (`process_cve_data()` in `filter_entries.py`).
4. For each filtered CVE, a job (essentially the Prospector function and the report generation function) is created and enqueued in the Redis Queue using `enqueue_jobs()` from `job_creation.py`; a minimal sketch of this step follows the list.
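
Below is an illustrative sketch of what enqueueing a single job with the `rq` library can look like. It is not the project's actual `enqueue_jobs()` implementation; the function `run_prospector_and_report`, the example CVE ID, and the Redis connection details are assumptions made purely for illustration.

```python
# Illustrative sketch only, not the actual pipeline code.
from redis import Redis
from rq import Queue


def run_prospector_and_report(cve_id: str, reports_filepath: str) -> None:
    """Hypothetical job body: run Prospector for one CVE and generate its report."""
    print(f"Running Prospector for {cve_id}, writing report to {reports_filepath}")


def enqueue_one(cve_id: str) -> None:
    # The Redis container created by `make docker-setup` exposes port 6379 on localhost.
    queue = Queue(connection=Redis(host="localhost", port=6379))
    job = queue.enqueue(
        run_prospector_and_report,
        cve_id,
        "pipeline/reports/",
        job_timeout=3600,  # the Prospector run and report generation can take a while
    )
    print(f"Enqueued job {job.id} for {cve_id}")


if __name__ == "__main__":
    enqueue_one("CVE-2024-0001")
```

In the real pipeline, `enqueue_jobs()` performs this step once for every CVE that survives the filtering in `process_cve_data()`.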
## Use the Pipeline

For the pipeline to work, first run

```bash
make docker-setup
```

to create the following five containers:

```bash
CONTAINER ID   IMAGE                COMMAND                   CREATED          STATUS          PORTS                                       NAMES
77e4b01ada4d   prospector_backend   "python ./service/ma…"    58 minutes ago   Up 58 minutes   0.0.0.0:8000->8000/tcp, :::8000->8000/tcp   prospector_backend_1
57a30c903a9a   prospector_worker    "/usr/local/bin/star…"    58 minutes ago   Up 58 minutes                                               prospector_worker_1
2ea00e47ac71   redis:alpine         "docker-entrypoint.s…"    58 minutes ago   Up 58 minutes   0.0.0.0:6379->6379/tcp, :::6379->6379/tcp   prospector_redis_1
120d3502ee51   postgres             "docker-entrypoint.s…"    58 minutes ago   Up 58 minutes   0.0.0.0:5432->5432/tcp, :::5432->5432/tcp   db
1d9acef24637   adminer              "entrypoint.sh php -…"    58 minutes ago   Up 58 minutes   0.0.0.0:8080->8080/tcp, :::8080->8080/tcp   prospector_adminer_1
```
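
The listing above is the output of `docker ps` (container IDs and timings will differ on your machine). A quick way to check that all five containers are up:

```bash
docker ps --format "table {{.Names}}\t{{.Status}}"
```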

Then enqueue the latest CVEs as jobs by running `python3 pipeline/main.py`.

### Increase the number of workers

Adjust the number of workers in `etc_supervisor_confd_rqworker.conf.j2`:

```bash
...
numprocs=2
...
```

## Observe the Pipeline

View the database at `localhost:8080`.

View the fetched vulnerabilities and generated reports at `localhost:8000`.

View worker output in the terminal by running `docker attach prospector_worker_1`, or read `prospector.log` (with more than one worker the log can be hard to follow, because lines from different workers are interleaved).
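
For example, to follow the worker output live or to tail the log file (assuming `prospector.log` is written to the directory Prospector runs from):

```bash
# Attach to the running worker container; detach again with Ctrl+P Ctrl+Q.
docker attach prospector_worker_1

# Or follow the shared log file.
tail -f prospector.log
```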

prospector/pipeline/job_creation.py

Lines changed: 1 addition & 1 deletion
@@ -213,6 +213,6 @@ async def enqueue_jobs(reports_filepath: str, creator: str = "Auto"):
     db.disconnect()
 
     console.print(
-        f"\n\tEnqueueing finished",
+        "\n\tEnqueueing finished",
         status=MessageStatus.OK,
     )

prospector/pipeline/main.py

Lines changed: 2 additions & 2 deletions
@@ -22,14 +22,14 @@ async def dispatch_jobs():
     save_cves_to_db(cve_data)
 
     # get entry from db and process
-    processed_cves = await process_cve_data()
+    _ = await process_cve_data()
 
     await enqueue_jobs(reports_filepath="pipeline/reports/")
 
 
 async def main():
     """Starting point to enqueue jobs into the pipeline"""
-    ConsoleWriter.print(f"Starting pipeline\n", status=MessageStatus.OK)
+    ConsoleWriter.print("Starting pipeline\n", status=MessageStatus.OK)
     await dispatch_jobs()

0 commit comments
