Distributed runs with Celery
... or how you can 10x your scanning speed and massively parallelize your workflows.
Prerequisite
To use distributed runs, make sure the worker addon is installed:
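For example, using secator's addon installer (verify the exact subcommand against your installed version with `secator install --help`):

```shell
secator install addons worker
```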
Step 1: Configure a broker [optional]
This step is optional. If you do not configure a broker, the file system will be used as a broker and result backend. Note that this works only if the client and worker run on the same VM.
You can set up a task queue using Celery with the broker and a results backend of your choice, and run Celery workers to execute tasks from the broker queue.
The following is an example using Redis, but you can use any supported Celery broker and backend.
Install the redis addon:
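This installs the Python Redis client used by secator (same hedge as above regarding the exact subcommand):

```shell
secator install addons redis
```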
Install the Redis server:
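On Debian/Ubuntu, for instance (package and service names vary by distribution):

```shell
sudo apt update
sudo apt install -y redis-server
```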
Start Redis and enable it at boot:
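On a systemd-based distribution (the service is named `redis-server` on Debian/Ubuntu, `redis` on some others):

```shell
sudo systemctl enable redis-server --now
sudo systemctl status redis-server   # verify it is running
```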
Configure secator to use Redis:
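A sketch of the configuration, assuming the `secator config set` subcommand and the `celery.broker_url` / `celery.result_backend` keys used by recent secator releases (run `secator config` to confirm the key names in your version):

```shell
secator config set celery.broker_url redis://<REDIS_IP>:6379/0
secator config set celery.result_backend redis://<REDIS_IP>:6379/1
```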
Make sure you replace <REDIS_IP> in the variables above with the IP of your Redis server.
Step 2: Start a Celery worker
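Assuming the standard secator CLI, a worker that consumes tasks from the broker queue can be started with:

```shell
secator worker
```

You can start additional workers on other machines pointing at the same broker to scale out horizontally.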
Step 3: Run a task, workflow or scan
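With a worker running, any task, workflow, or scan is dispatched through the broker. The runner names below (`httpx`, `host_recon`, `domain`) are illustrative; list the ones available in your install with `secator x --help`, `secator w --help`, and `secator s --help`:

```shell
secator x httpx example.com       # run a single task
secator w host_recon example.com  # run a workflow
secator s domain example.com      # run a scan
```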
If you want to run synchronously (bypassing the broker), you can use the --sync flag (CLI) or the sync kwarg (Python).
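For example (task name illustrative, as above):

```shell
secator x httpx example.com --sync
```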