Distributed runs with Celery
... or how you can 10x your scanning speed and massively parallelize your workflows.
By default, secator runs all tasks synchronously. This guide shows how to enable distributed runs using Celery workers, which unlocks massive parallelization of your scans and workflows.
To use distributed runs, make sure the worker addon is installed:
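A minimal sketch of this step, assuming the `secator install addons` subcommand available in your secator release:

```bash
# Install the worker addon (pulls in Celery support)
secator install addons worker
```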
This step is optional. If you do not configure a broker, the file system will be used as a broker and result backend. Note that this works only if the client and worker run on the same VM.
You can set up a task queue using Celery with a broker and results backend of your choice, and run Celery workers to execute tasks from the broker queue.
The following is an example using redis, but you can use any supported Celery broker and backend.
Install the redis addon:
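For example, assuming the same `secator install addons` subcommand:

```bash
# Install the redis addon (Redis client support for the Celery broker/backend)
secator install addons redis
```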
Install redis:
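A sketch for a Debian/Ubuntu host; adapt the package manager and package name (`redis` vs `redis-server`) to your distribution:

```bash
sudo apt update
sudo apt install -y redis-server
```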
Start redis and enable it at boot:
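Assuming a systemd-based distribution where the service is named `redis-server` (it may be `redis` on some systems):

```bash
# Start the Redis service now and enable it at boot
sudo systemctl enable --now redis-server
```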
Configure secator to use Redis:
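A sketch of the configuration, assuming secator reads its Celery settings from `SECATOR_`-prefixed environment variables; check your version's configuration reference for the exact keys:

```bash
export SECATOR_CELERY_BROKER_URL="redis://<REDIS_IP>:6379/0"
export SECATOR_CELERY_RESULT_BACKEND="redis://<REDIS_IP>:6379/0"
```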
Make sure you replace <REDIS_IP> in the variables above with the IP of your Redis server.
If you want to run synchronously (bypassing the broker), you can use the --sync flag (CLI) or the sync kwarg (Python).
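For example, on the CLI (the httpx task and target below are only illustrative):

```bash
# Run a single task synchronously, without going through the Celery broker
secator x httpx example.com --sync
```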