😎 Scaling your Django App with Celery
Here, I will demonstrate how I integrated Celery into my Django app after wading through many frustrating documentation pages. Trust me, you will find this is the easiest way.
Requirements:
a running Django app template (if you don't have one, clone this repo)
celery
django-celery-beat
Docker (optional) or redis-server
Celery in a nutshell:
Celery is a distributed task queue that helps you manage and execute tasks in the background.
To send and receive tasks, Celery requires a message broker such as RabbitMQ or Redis. Celery worker nodes are used for offloading data-intensive processes to the background, making applications more efficient. Celery is highly available, and a single Celery worker can process millions of tasks a minute. A worker continuously watches the broker queue, picks up a task, and spins up a child process to handle it; because workers perform critical tasks at scale, it is also important to monitor their performance. We will be using Redis as the message broker because it is easy to set up and popular.
Celery beat is a scheduler: it kicks off tasks at regular intervals, which are then executed by the available worker nodes in the cluster.
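To make the idea concrete, here is a minimal preview of what offloading looks like once everything below is wired up. The app name 'mainapp', the task name, and the task body are hypothetical placeholders:
# mainapp/tasks.py (illustrative sketch; names are placeholders)
from celery import shared_task

@shared_task
def generate_report(report_id):
    # Imagine a slow, data-intensive job here. The web process returns
    # immediately; a worker picks this task up from the broker queue.
    print(f"Generating report {report_id}")

# Somewhere in a view, enqueue it instead of running it inline:
# generate_report.delay(42)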
So let's install celery and django-celery-beat. Note the celery[redis] extra, which pulls in the Redis client that the broker connection needs:
$ pip install "celery[redis]" django-celery-beat
After installing celery and django-celery-beat, let's do some configuration in the Django project.
First, create a 'celery.py' file at the root of your project app. For illustration, if 'myapp' is your root project, then create 'celery.py' under the 'myapp' folder.
# myapp/celery.py
import os

from celery import Celery

# Make sure Django settings are loaded before the Celery app is created.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myapp.settings")

app = Celery("myapp")

# Read every setting prefixed with CELERY_ from settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()

# Store beat schedules in the database instead of a local file.
app.conf.beat_scheduler = "django_celery_beat.schedulers:DatabaseScheduler"

@app.task(bind=True, ignore_result=True)
def debug_task(self):
    print(f"Request: {self.request!r}")
In the above file, we have set some configuration that is important for the Celery app to run. We use 'DatabaseScheduler' as the scheduler, which stores all schedules in the database. Alternatively, you can use 'PersistentScheduler', which keeps the schedules in a local file instead of the database.
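'PersistentScheduler' is in fact Celery's default, so if you prefer the file-based approach you can simply omit the beat_scheduler line and optionally pass -s to choose where the schedule file lives (the path below is just an example):
$ celery -A myapp beat -l INFO -s /tmp/celerybeat-schedule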
Make some changes in the __init__.py file as below:
# myapp/__init__.py
# Ensure the Celery app is imported when Django starts,
# so that @shared_task uses it.
from .celery import app as celery_app

__all__ = ("celery_app",)
Now add some variables and the installed app in the settings.py file:
# settings.py
...
INSTALLED_APPS = [
    ...
    'django_celery_beat',
]
...
# 'import os' is needed at the top of settings.py for the two lines below.
CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://127.0.0.1:6379/0")
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BACKEND", "redis://127.0.0.1:6379/0")
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60  # hard limit: kill tasks running longer than 30 minutes
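Since the broker and result-backend URLs are read from the environment with localhost defaults, you can point a deployment at a different Redis instance without touching code (the hostname below is hypothetical):
$ export CELERY_BROKER=redis://my-redis-host:6379/0
$ export CELERY_BACKEND=redis://my-redis-host:6379/0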
After all the configuration, apply the migrations using the command below (tables will be created to store tasks and schedules):
$ python manage.py migrate
Now start a Redis container with Docker; this will pull the Redis image and run it in detached mode on port 6379:
$ docker run -d -p 6379:6379 redis
Or, if you don't have Docker, you can simply download Redis from its official site and run it directly.
To start the redis-server: $ sudo service redis-server restart
Check that it is up: $ redis-cli ping
The output will be: PONG
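You can also verify connectivity from Python using the redis client (installed with the celery[redis] extra, or via pip install redis):
# quick sanity check from a Python shell
import redis

r = redis.Redis(host="127.0.0.1", port=6379, db=0)
print(r.ping())  # True if the broker is reachable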
All configuration is done; let's start Celery and Beat.
To start a Celery worker node: $ celery -A <app-name> worker -l INFO
To start beat for the scheduler: $ celery -A <app-name> beat -l INFO
Or you can run both the worker and beat simultaneously with one command:
$ celery -A <app-name> worker --beat -l INFO
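A quick way to confirm the worker is wired up correctly is to fire the debug_task we defined in celery.py from the Django shell (assuming the project is named 'myapp' as above); the worker terminal should print the request info:
$ python manage.py shell
>>> from myapp.celery import debug_task
>>> debug_task.delay()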
Let's schedule a task using the Django admin panel. Add a new task under the 'Periodic tasks' table, and specify the date and time at which you would like the trigger to fire.
After clicking 'Save', you will see the task being picked up and executed in the worker terminal. If you see this, congratulations! You've made it.
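If you prefer not to click through the admin, the same kind of periodic task can be created programmatically with django_celery_beat's models. Here is a minimal sketch that schedules the debug_task from celery.py every 10 seconds (the interval and display name are arbitrary examples):
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# Reuse an existing 10-second interval or create one.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

PeriodicTask.objects.create(
    interval=schedule,
    name="Debug task every 10 seconds",  # must be unique
    task="myapp.celery.debug_task",      # registered task name
)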