How to set up Celery in Django

by Alex

In this guide to using Celery in conjunction with Django, I will cover:

  1. How to set up Celery with Django.
  2. How to test a Celery task in the Django shell.
  3. How to monitor the performance of a Celery application.

You can find the project's source code in this repository.

Why a Django application needs Celery

Celery is needed to run tasks in a separate worker process, which lets you send an HTTP response to the user immediately from the web process (even if the task in the worker process is still running). The request loop is not blocked, which improves the user experience. Below are some examples of how to use Celery:

  • You’ve created an application with a comment feature where a user can use the @ symbol to mention another user, after which the latter receives an email notification. If a user mentions 10 people in a comment, the web process has to build and send 10 emails. Sometimes this takes a long time (network, server load, and other factors). In this case, Celery can send the emails in the background and return an HTTP response to the user without waiting (a rough sketch of such a task follows after this list).
  • Need to create a thumbnail of an image uploaded by the user? Such a task belongs in a worker process.
  • You need to do something periodically, such as generate a daily report or clean up expired session data. Use Celery to send tasks to your workers at a designated time.
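To make the first case more concrete, here is a minimal sketch of such a background task. The file path polls/tasks.py, the function name notify_mentioned_users, and the addresses are assumptions for illustration, and it presumes your Django email settings are configured:

# polls/tasks.py -- illustrative sketch only
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def notify_mentioned_users(emails, comment_text):
    """Send a notification email to each mentioned user in the background."""
    for address in emails:
        send_mail(
            subject="You were mentioned in a comment",
            message=comment_text,
            from_email="noreply@example.com",
            recipient_list=[address],
        )

The view would call notify_mentioned_users.delay(emails, comment_text) and return its HTTP response immediately, while the worker sends the emails.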

When you create a web application, try to keep the response time below 500 ms (tools such as New Relic or Scout APM can measure it). If the user waits too long for a response, find out the reason and fix it. Celery can help solve this kind of problem.

Celery or RQ

RQ (Redis Queue) is another Python library that solves the same problems. Its logic is similar to Celery's (both use a producer/consumer design pattern). Below is a brief comparison to help you decide which tool is more suitable for your task.

  • RQ (Redis Queue) is easy to learn and aims to lower the barrier to entry for asynchronous workers. It has fewer features and only works with Redis and Python.
  • Celery provides more features and supports many different server configurations. One of the downsides of this flexibility is the more complicated documentation, which quite often scares newcomers.

I prefer Celery because it handles a wide range of problems well. This article is intended to help readers (especially newcomers) learn Celery quickly!

Message Broker and Results Backend

A message broker is a store that acts as the transport between producer and consumer. The Celery documentation recommends RabbitMQ as the broker because it supports AMQP (Advanced Message Queuing Protocol). Since in many cases we do not need AMQP, another queue manager such as Redis will also do. The result backend is a store that holds the results of Celery tasks and any errors that occurred; here Redis is the recommended choice.

How to configure Celery

Celery no longer officially supports Windows, so use Linux or the Ubuntu terminal on Windows (WSL). Next, I will show you how to add a Celery worker to your Django project. We will use Redis as both the message broker and the result backend, which keeps things a little simpler, but you are free to choose any other combination that meets your application’s requirements.

Use Docker to prepare your development environment

If you are on Linux or macOS, you can install Redis with your package manager (brew install, apt-get install), but I recommend using Docker to run the Redis server instead.

  1. You can download the Docker client here.
  2. Then run the Redis service: $ docker run -p 6379:6379 --name some-redis -d redis

The command above will start Redis on 127.0.0.1:6379.
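If you want to verify that the container is reachable, a quick check from Python might look like this (the redis package is installed later in this guide together with Celery, so this step is optional):

import redis

# connect to the Redis container started above and send a PING
r = redis.Redis(host="127.0.0.1", port=6379, db=0)
print(r.ping())  # prints True if the server is reachable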

  1. If you intend to use RabbitMQ as the message broker, you only need to change the image in the command above.
  2. Once you’re done with your project, you can remove the Docker container – your development environment stays clean.

Now let’s add Celery to our Django project.

Creating a Django project

I recommend creating a separate virtual environment and working in it.

$ pip install django==3.1
$ django-admin startproject celery_django
$ python manage.py startapp polls

Below is the structure of the project.

├── celery_django
│   ├── __init__.py
│   ├── asgi.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── manage.py
└── polls
    ├── __init__.py
    ├── admin.py
    ├── apps.py
    ├── migrations
    │   └── __init__.py
    ├── models.py
    ├── tests.py
    └── views.py

The celery.py file

Let’s start installing and configuring Celery.

pip install celery==4.4.7 redis==3.5.3 flower==0.9.7

Create the file celery_django/celery.py next to celery_django/wsgi.py.

"""
Celery settings file
https://docs.celeryproject.org/en/stable/django/first-steps-with-django.html
"""
from __future__ import absolute_import
import os
from celery import Celery
# this code is copied from manage.py
# it will set the Django default settings module for the 'celery' application.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'celery_django.settings')
# here you change the name
app = Celery('celery_django')
# To get Django settings, associate the "CELERY" prefix with the celery setting
app.config_from_object('django.conf:settings', namespace='CELERY')
# loading tasks.py into the django app
app.autodiscover_tasks()
@app.task
def add(x, y):
return x / y

The __init__.py file

Let’s continue modifying the project. Add the following to celery_django/__init__.py:

from __future__ import absolute_import, unicode_literals

# This ensures the Celery app is always imported when Django starts
from .celery import app as celery_app

__all__ = ('celery_app',)

Update settings.py

Since Celery can read the configuration from the Django settings file, we will make the following changes to it.

CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"

There are a few things to keep in mind. If you study the Celery documentation, you’ll see that broker_url is the configuration key you are supposed to set for the message broker, but in the celery.py above:

  1. app.config_from_object('django.conf:settings', namespace='CELERY') tells Celery to read values from the CELERY namespace, so if you set just broker_url in your Django settings file, the setting will not take effect. The same rule applies to all configuration keys in the Celery documentation (see the example after this list).
  2. Some configuration keys differ between Celery 3 and Celery 4, so please check the documentation when setting them.
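For example, the documented keys broker_url, task_time_limit and timezone become the following in settings.py (the extra values below are illustrative assumptions, not required by this project):

CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"  # documented key: broker_url
CELERY_TASK_TIME_LIMIT = 30 * 60                # documented key: task_time_limit (seconds)
CELERY_TIMEZONE = "UTC"                         # documented key: timezone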

Sending Celery Jobs

Once you’re done with the configuration, you’re all set to use Celery. We will run a few commands in separate terminals (I recommend taking a look at tmux when you have time). Make sure Redis is running first, then start a Celery worker in another terminal; celery_django is the name of the Celery application you defined in celery_django/celery.py.

$ celery worker -A celery_django --loglevel=info
-------------- celery@hostname v4.4.7 (cliffs)
--- ***** -----
-- ******* ---- Linux-4.4.0-19041-Microsoft-x86_64-with-glibc2.27 2021-03-15 15:03:44
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: celery_django:0x7ff07f818ac0
- ** ---------- .> transport: redis://127.0.0.1:6379/0
- ** ---------- .> results: redis://127.0.0.1:6379/0
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. celery_django.celery.add

Next, let’s start Flower in a new terminal; it will help us keep track of Celery tasks (I’ll talk about that a bit later).

$ flower -A celery_django --port=5555
[I 210315 16:11:39 command:135] Visit me at http://localhost:5555
[I 210315 16:11:39:142] Broker: redis://127.0.0.1:6379/0
[I 210315 16:11:39:143] Registered tasks:
['celery.accumulate',
'celery.backend_cleanup',
'celery.chain',
'celery.chord',
'celery.chord_unlock',
'celery.chunks',
'celery.group',
'celery.map',
'celery.starmap',
'celery_django.celery.add']
[I 210315 16:11:39 mixins:229] Connected to redis://127.0.0.1:6379/0

Then open http://localhost:5555/. You should see a dashboard showing the details of the Celery workers. Now let’s open the Django shell and try sending Celery some tasks.

$ python manage.py migrate
$ python manage.py shell
...
>>> from celery_django.celery import add
>>> task = add.delay(1, 2)

Let’s take a look at a few things:

  1. We call the task’s delay method to send a message to the broker. The worker process picks up the task and executes it.
  2. When you press Enter after task = add.delay(1, 2), the command appears to finish instantly (nothing blocks), but the add task is still running in the Celery worker process.
  3. If you check the terminal output where Celery was running, you will see something like this:
[2021-03-15 15:04:32,859: INFO/MainProcess] Received task: celery_django.celery.add[e1964774-fd3b-4add-96ff-116e3578dede]
[2021-03-15 15:04:32,882: INFO/ForkPoolWorker-1] Task celery_django.celery.add[e1964774-fd3b-4add-96ff-116e3578dede] succeeded in 0.01341869999999988348s: 0.5

The worker received the task at 15:04:32 and completed it successfully. I think you now have a basic understanding of how to use Celery. Let’s try one more block of code.

>>> print(task.state, task.result)
SUCCESS 0.5

Then let’s try causing an error in the Celery worker and see what happens.

>>> task = add.delay(1, 0)
>>> type(task)
celery.result.AsyncResult
>>> task.state
'FAILURE'
>>> task.result
ZeroDivisionError('division by zero')

As you can see, the result of the delay method call is an instance of AsyncResult. We can use it as follows:

  1. Check the state of the task.
  2. Get the returned value (result) or exception information.
  3. Get other metadata (see the shell sketch below).
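For example, continuing in the same Django shell, a few standard AsyncResult calls (a rough sketch; output depends on your run):

>>> task.ready()            # True once the task has finished, even if it failed
True
>>> task.id                 # the task UUID; AsyncResult(task.id) can retrieve it later
>>> ok = add.delay(4, 2)
>>> ok.get(timeout=10)      # blocks until the result arrives; re-raises task exceptions
2.0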

Monitoring Celery with Flower

Flower displays information about Celery workers and tasks on a web page with a user-friendly interface. This makes it much easier to understand what’s going on, so I want to turn your attention to Flower before delving further into Celery. The URL of the dashboard is http://127.0.0.1:5555/; open the Tasks page there. When learning Celery, it is quite useful to watch Flower to better understand the details. When you deploy your project to a server, Flower is not a mandatory component: you can use Celery commands directly to manage the application and check the status of the workers.
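For example, much of what Flower shows can also be pulled programmatically through Celery’s inspect API; a minimal sketch, run from the project’s Python shell while the worker is running:

from celery_django.celery import app

inspector = app.control.inspect()
print(inspector.registered())  # tasks registered with each running worker
print(inspector.active())      # tasks currently being executed
print(inspector.stats())       # per-worker statistics (pool size, counters, ...)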

Conclusion

In this article, I have covered the basic aspects of Celery. I hope that after reading it, you will have a better understanding of Celery. You can find the source code in the link at the beginning of this article.
