Distributed task queue with Python using Celery and FastAPI

Andrea Capuano
5 min read · Jul 22, 2020

In this article we will use RabbitMQ and Celery to create a distributed task queue. We will also take advantage of FastAPI to accept incoming requests and enqueue them on RabbitMQ.

A distributed task queue is a scalable architectural pattern, widely used in production applications, that ensures large amounts of messages/tasks are consumed and processed asynchronously by a pool of workers.

The following diagram briefly explains what we will achieve in this article:

A new request arrives and is ingested by the REST endpoint exposed through FastAPI. A message is produced and published to RabbitMQ; a Celery worker then consumes the message and runs the task.
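
To make this flow concrete, here is a minimal sketch of the pieces we will build. The module name, broker URL, and the add task are assumptions for illustration, not the final code:

    from celery import Celery
    from fastapi import FastAPI

    # Celery app pointing at a local RabbitMQ broker (default guest credentials assumed)
    celery_app = Celery("worker", broker="amqp://guest:guest@localhost:5672//")

    @celery_app.task
    def add(x: int, y: int) -> int:
        # Executed by a Celery worker when it consumes the message
        return x + y

    app = FastAPI()

    @app.post("/add")
    def enqueue_add(x: int, y: int):
        # Publish a message to RabbitMQ and return immediately;
        # a worker will pick the task up asynchronously
        result = add.delay(x, y)
        return {"task_id": result.id}

With RabbitMQ up, the API would run under uvicorn and the consumer would be started with celery -A worker worker (assuming the sketch lives in a module named worker).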

Setting up the tools: RabbitMQ through Docker

Before diving into the code, we will set up RabbitMQ: an open-source, production-ready message broker. RabbitMQ will be used as a means of distributing tasks among different workers.

To set it up, we will use Docker and docker-compose; this way we keep some isolation and retain lift-and-shift properties thanks to containerization.
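
As a reference, a minimal docker-compose.yml for the broker could look like the sketch below; the image tag and exposed ports are assumptions, not necessarily the exact file used here:

    # docker-compose.yml
    version: "3"
    services:
      rabbitmq:
        image: rabbitmq:3-management   # the management tag adds a web UI
        ports:
          - "5672:5672"     # AMQP port used by Celery
          - "15672:15672"   # management UI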

Let’s get the latest version of the RabbitMQ Docker image from DockerHub by issuing the following command in your terminal:
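
    # The :3-management tag is an assumption; it bundles the web management UI.
    # A plain docker pull rabbitmq would fetch the broker image without it.
    docker pull rabbitmq:3-management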
