Django async tasks configuration

Web applications always have time-consuming operations: sending emails, preparing reports, running business logic, and many others. Making every operation synchronous forces the user to sit through dead time waiting for a response, and our application will feel slow.

By making these operations asynchronous, our application can respond to the user faster, returning a message or a view immediately while the heavy work is processed in the background.

How do we do that?

Celery and [Redis / RabbitMQ]

Django has different ways to make this happen! One of them is Celery, a distributed task queue, paired with a broker such as Redis or RabbitMQ. This makes your system work like this:

Django → Celery → Redis

Docker configuration

version: '2'
services:
  postgres:
    image: postgres:9.5
    env_file: .env

  django:
    restart: always
    build: ./django
    links:
      - postgres:postgres
      - redis:redis
    depends_on:
      - postgres
    env_file: .env
    ports:
      - "8081:8081"
    command: ./run_django.sh

  celery:
    restart: always
    build: ./django
    links:
      - redis:redis
    env_file: .env
    command: ./run_celery.sh

  redis:
    restart: always
    image: redis:4.0.6
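
The compose file refers to two startup scripts that are not shown. As an assumption of what they contain, minimal versions might look like this (the app name myproject matches the Celery config below):

run_django.sh (a hypothetical sketch):

#!/bin/bash
# Apply migrations, then serve Django on the port exposed in the compose file
python manage.py migrate
python manage.py runserver 0.0.0.0:8081

run_celery.sh (a hypothetical sketch):

#!/bin/bash
# Start a Celery worker for the app defined in celery.py below
celery -A myproject worker --loglevel=info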

Celery config

celery.py

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# Set the default settings module before the Celery app is created
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings.prd")

app = Celery('myproject')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')

# Look for a tasks.py module in every installed app
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
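
To make sure this app is loaded every time Django starts, the Celery documentation recommends importing it in your project's __init__.py:

from .celery import app as celery_app

__all__ = ('celery_app',)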

In settings.py, add:

# CELERY SETTINGS
BROKER_URL = 'redis://redis:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
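
Optionally, if you also want to keep task results, you can reuse the same Redis instance as the result backend:

CELERY_RESULT_BACKEND = 'redis://redis:6379/0'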

Define tasks for your apps.

By convention, add a tasks.py file to each of your apps to hold its async operations, following this example:

from myproject.celery import app


@app.task
def my_heavy_process(var1, var2):
  # Business logic!
  # Send emails
  # Generate reports
  # Gather lots of information and store the result
  pass
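
For something concrete, here is a minimal sketch of a task that sends an email with Django's built-in send_mail (the task name and addresses are placeholders):

from django.core.mail import send_mail

from myproject.celery import app


@app.task
def send_welcome_email(user_email):
  # Runs in the Celery worker, so the view never blocks on SMTP
  send_mail(
    'Welcome!',
    'Thanks for signing up.',
    'noreply@example.com',
    [user_email],
  )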

Call your tasks!

from django.views import View

from my_app.tasks import my_heavy_process


class MyView(View):
  def get(self, request):
    # business logic here
    my_heavy_process.delay(var1, var2)  # var1/var2 come from your logic
    # business logic here
    # return a response
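
delay() is a shortcut for apply_async(); the longer form accepts execution options, for example running the task a minute later:

my_heavy_process.apply_async((var1, var2), countdown=60)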

Advantages:

  1. Easy configuration with Docker.
  2. With RabbitMQ as the broker, unfinished jobs can be persisted and recovered easily after a failure.

Disadvantages:

  1. Requires more containers/servers.
  2. Requires more DevOps maintenance.

Asyncio

There is another way to run async operations in Django: using Python 3.5+ asyncio to hand some operations off to an executor through the event loop.

import asyncio

from django.views import View

loop = asyncio.get_event_loop()


def heavy_operation(args):
  # business logic, emails, reports...
  pass


class MyView(View):
  def get(self, request):
    # business logic here
    arguments = ['var1', 'var2']
    # Submit the heavy work to the loop's default executor (a thread pool)
    loop.run_in_executor(None, heavy_operation, arguments)
    # business logic here
    # return a response
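
Passing None as the first argument schedules heavy_operation on the loop's default thread pool. If you want control over the number of worker threads, you can pass an explicit executor instead (one possible variation):

from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)
loop.run_in_executor(executor, heavy_operation, arguments)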

Advantages:

  1. Easy to configure.
  2. Requires no extra container/server.
  3. Requires no extra maintenance.

Disadvantages:

  1. Unfinished jobs are lost if the service restarts.
  2. Only works with Python 3.5+ (older projects may require a Python/Django upgrade).

Final thoughts

Our applications must respond fast! That is why it is worth keeping these options in mind for handling heavy operations. Weigh the trade-offs and choose the one that fits your project best.

Here are some useful links!