Python Background Tasks with Django

Running periodic tasks in the background can be challenging for developers. After completing this post, you will be able to create your own custom Django background worker and scheduler.

Are periodic tasks useful?

Imagine you're using Django as a content aggregator, and you want to automatically fetch blog posts from multiple websites every 10 minutes. This is where asynchronous tasks come into play. You can write a function that runs periodically without any human intervention.

In this post you'll learn about three technologies/tools:

  1. Celery
  2. Django
  3. Redis

What is Celery?

Celery is an asynchronous task queue based on distributed message passing. It can also be used for scheduling tasks, and that's what we'll be using it for in this post.
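Celery's core model is simple: producers put task messages on a queue, and worker processes pull them off and execute them. The pattern can be sketched with the standard library alone (this is only an illustration of the idea, not how Celery is actually implemented):

```python
import queue
import threading

# a minimal task queue: producers put (function, args) pairs,
# and a worker thread pulls and executes them asynchronously
tasks = queue.Queue()
results = []

def worker():
    while True:
        func, args = tasks.get()
        if func is None:          # sentinel tells the worker to stop
            break
        results.append(func(*args))

t = threading.Thread(target=worker)
t.start()

tasks.put((lambda x: x * 2, (21,)))   # "delay" a task to the worker
tasks.put((None, None))               # shut the worker down
t.join()
print(results)  # [42]
```

In Celery, the queue lives in a broker (Redis here) instead of in-process memory, so producers and workers can run on different machines.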

What is Redis?

Redis is an in-memory data structure store. It can be used as a caching engine, and besides being excellent for caching, it is also very simple to configure with Django.

In this project, however, we're going to use Redis as a message broker. Celery uses brokers to pass messages between Django and the workers we're going to create.

A message broker is a computer program that translates a message from the sender's formal messaging protocol to the receiver's formal messaging protocol.
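Conceptually, a broker just routes messages published to named channels to whichever consumer reads them, decoupling sender from receiver. A toy sketch of that idea (nothing like Redis internally, just the shape of the pattern):

```python
import queue

# toy broker: one queue per named channel
broker = {"tasks": queue.Queue()}

def publish(channel, message):
    # the producer only knows the channel name, not the consumer
    broker[channel].put(message)

def consume(channel):
    # the consumer only knows the channel name, not the producer
    return broker[channel].get()

publish("tasks", {"task": "update_hackernews", "args": []})
message = consume("tasks")
print(message["task"])  # update_hackernews
```

This is the role Redis plays below: Django publishes task messages, and the Celery workers consume them.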

The Project Setup

I'm going to assume you already have Django installed and running; this section covers only the Redis and Celery setup. First, install Redis from the official Redis page.

After downloading Redis, start the Redis server by running $ redis-server in your terminal. To test whether the Redis server is working properly, run $ redis-cli ping; it should reply with PONG.

Open your terminal and run the following commands to install Celery, Redis, and the other Python dependencies we'll need.

pip install celery
pip install redis
pip install django-celery-beat
pip install django-redis

Add Celery and Redis to your settings.py

After you have installed Redis and verified that it is working, add the following code to your settings.py file.

INSTALLED_APPS = [
    # ---------------all the other apps -----------
    # django_celery_beat helps you view and manage your tasks from the django admin panel
    'django_celery_beat',
]

BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']

# Redis caching (not related to this post, but you can go ahead and configure your caching as well)
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://localhost:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient"
        },
        "KEY_PREFIX": "napi"
    }
}

CACHE_TTL = 60 * 15

After you have edited your settings.py, run $ python manage.py migrate

Create a Task in your Django app (tasks.py)

from celery import shared_task
import requests

# get hacker news top stories
def get_hackernews_json():
    url = "https://hacker-news.firebaseio.com/v0/topstories.json"
    try:
        r = requests.get(url)
        return r.json()
    except requests.exceptions.HTTPError as errh:
        print("Http Error:", errh)
    except requests.exceptions.ConnectionError as errc:
        print("Error Connecting:", errc)
    except requests.exceptions.Timeout as errt:
        print("Timeout Error:", errt)
    except requests.exceptions.RequestException as err:
        print("OOps: Something Else", err)

# get hacker news top stories in detail
def update_hackernews_news():
    json = get_hackernews_json()
    if json is not None:
        for story in json:
            url = 'https://hacker-news.firebaseio.com/v0/item/{}.json'.format(story)
            try:
                r = requests.get(url)
                item = r.json()
                news_obj = item  # save/update the story in your model here
            except requests.exceptions.RequestException as err:
                print("OOps: Something Else", err)

# create your celery task
# the name must match the task name referenced in the beat schedule
@shared_task(name="update_hackernews")
def update_hackernews():
    update_hackernews_news()

This is the task that will be scheduled to run.
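A subtlety in get_hackernews_json's error handling: HTTPError, ConnectionError, and Timeout all inherit from requests.exceptions.RequestException, so the generic RequestException handler must come last in the chain of except clauses; if it comes first, it catches everything and the specific handlers never run. The hierarchy is easy to verify:

```python
import requests

# HTTPError, ConnectionError and Timeout are all subclasses of
# RequestException, so `except RequestException` placed first would
# swallow them before the specific handlers are reached
for exc in (requests.exceptions.HTTPError,
            requests.exceptions.ConnectionError,
            requests.exceptions.Timeout):
    print(exc.__name__, issubclass(exc, requests.exceptions.RequestException))
```

Python evaluates except clauses top to bottom and takes the first match, which is why ordering from most specific to most general matters.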

Configuring the Scheduler

Now, create a celery.py file inside the root project directory (the same directory as settings.py) and paste the following code.

from __future__ import absolute_import
import os
from celery import Celery
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program
# (replace api.settings with the location of your project settings)
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'api.settings')
app = Celery('api')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

# create a 5-minute schedule for your hacker news task
app.conf.beat_schedule = {
    'update_hackernews_task': {
        'task': 'update_hackernews',
        'schedule': crontab(minute="*/5"),
    },
}
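crontab isn't the only schedule type Celery accepts. For simple intervals, you can pass a plain timedelta instead (a config fragment for the same celery.py, roughly equivalent to the crontab schedule above, except that crontab fires at clock-aligned minutes while timedelta counts from startup):

```python
from datetime import timedelta

app.conf.beat_schedule = {
    'update_hackernews_task': {
        'task': 'update_hackernews',
        'schedule': timedelta(minutes=5),  # run every 5 minutes
    },
}
```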

Run the Project

Now, a moment of truth. Open two terminals and run the following commands (replace api with your project name in the -A flag):

celery -A api worker -l info
celery -A api beat -l info

To run the project on a live server, you need a process manager such as Supervisor; follow the daemonization instructions in the Celery documentation to keep the worker and beat processes running.

