By default, Celery routes all tasks to a single queue and all workers consume from this default queue. You can change this behaviour by telling Celery which tasks to send to which queues. This is known as task routing.
This is useful if you have slow and fast tasks and don't want the slow tasks to hold up the fast ones, or if you run a gevent pool worker for I/O-bound tasks alongside a prefork pool worker for CPU-heavy tasks.
Step 1: Configure task_routes
The first thing is to assign a queue to each task. For example, to route task_1 to queue_a and task_2 to queue_b:
from celery import Celery

app = Celery(
    __name__,
    broker="redis://localhost:6379/0",
    task_routes={
        "task_1": {"queue": "queue_a"},
        "task_2": {"queue": "queue_b"},
    },
)
Step 2: Worker queues
The --queues command-line argument makes a Celery worker process tasks from one or more queues and ignore everything else. For the example above, I have one worker for queue_a...
# process tasks in queue_a only
$ celery --app=worker.app worker --queues=queue_a
...and another worker processing tasks from queue_b:
# process tasks in queue_b only
$ celery --app=worker.app worker --queues=queue_b
Bonus: Multiple queues
If you want your worker to process tasks from more than one queue, pass a comma-separated list to the --queues argument:
$ celery --app=worker.app worker --queues=queue_a,queue_b
Step 3: Give it a go
You can find the complete example code on GitHub. Clone the repository, create a virtual environment and install the dependencies:
$ git clone https://github.com/bstiel/celery-task-routing.git
$ cd celery-task-routing
$ python -m venv venv
$ source venv/bin/activate
$ pip install -r requirements.txt
Start the Redis message broker:
$ docker compose up -d
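The compose file only needs a single Redis service to back the broker URL from Step 1; a minimal sketch (the repository's file may differ):

```yaml
services:
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```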
Start the two Celery workers and the producer, which produces task_1 and task_2 once a second inside a loop. All three commands are wrapped in a Procfile that you can start via honcho (or foreman):
$ honcho start
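The Procfile bundles the three processes; roughly (the producer entry is an assumption, the repository may name it differently):

```
worker_1: celery --app=worker.app worker --queues=queue_a
worker_2: celery --app=worker.app worker --queues=queue_b
producer: python producer.py
```

honcho prefixes each log line with the process name, which is where the worker_1.1 and worker_2.1 labels in the output below come from.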
You will see worker_1 processing task_1 only, and worker_2 processing task_2 only.
11:42:16 worker_1.1 [2023-12-20 11:42:16,904: WARNING/MainProcess] hello from task_1
11:42:16 worker_2.1 [2023-12-20 11:42:16,913: WARNING/MainProcess] hello from task_2
Summary
In this blog post, you learned how to configure Celery to route tasks to dedicated queues and make Celery workers process tasks from certain queues only.
To achieve that, you need to map tasks to queues. This approach works as long as you only have a limited number of tasks. For more complex setups involving many tasks, queues and even services, dynamic task routing offers a more scalable and maintainable solution.
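As a glimpse of what that can look like, task_routes also accepts a router function instead of a per-task map, so routing decisions can be derived from the task name rather than listed one by one. A minimal sketch, with hypothetical slow./fast. name prefixes:

```python
def route_task(name, args, kwargs, options, task=None, **kw):
    # hypothetical convention: anything under the "slow." namespace
    # goes to queue_b, everything else to queue_a
    if name.startswith("slow."):
        return {"queue": "queue_b"}
    return {"queue": "queue_a"}

# wired up in the Celery app configuration:
# app.conf.task_routes = (route_task,)
```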
Last updated Dec 19, 2023
First published May 29, 2018