How to Cache Using Redis in Django Applications

Published on Aug. 22, 2023, 12:12 p.m.

Caching gets messy quickly. Here are some best practices for using Redis as a cache in Django.

What is Redis?

Redis is an in-memory data structure store, used as a database, cache, and message broker. At its core it is a simple key-value store with very fast reads and writes, because it keeps all of its data in primary memory (RAM).
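Conceptually, Redis maps string keys to values and can expire entries after a time-to-live. Here is a minimal in-process sketch of that idea in plain Python (not Redis itself, just an illustration of the key-value-with-expiry model):

```python
import time

class TinyKeyValueStore:
    """Toy in-memory key-value store with optional expiry, mimicking Redis SET/GET."""

    def __init__(self):
        self._data = {}  # key -> (value, expires_at or None)

    def set(self, key, value, ttl=None):
        # ttl is in seconds; None means the entry never expires
        expires_at = time.monotonic() + ttl if ttl else None
        self._data[key] = (value, expires_at)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.monotonic() > expires_at:
            del self._data[key]  # entry expired, drop it
            return None
        return value

store = TinyKeyValueStore()
store.set("greeting", "hello", ttl=60)
print(store.get("greeting"))  # hello
```

Redis adds persistence, networking, and many more data types on top of this basic model, but get/set with expiry is the part a Django cache uses.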

Installing Django

Let’s integrate Redis with my second favourite web framework, Django.
First, set up a Python environment and install Django:

mkdir django-redis-demo
cd django-redis-demo
python3 -m venv venv
source venv/bin/activate

pip install django

Once we have that done, let’s initialize our project named core and create an app:

django-admin startproject core .
python manage.py startapp app

Basic Models

# app/models.py

from django.db import models

class SampleModel(models.Model):
    title = models.CharField(max_length=200)

    def __str__(self) -> str:
        return self.title

Don’t forget to add our app to INSTALLED_APPS.

# core/settings.py

INSTALLED_APPS = [
    # default django apps...

    'app'
]
Then create and apply the migrations:

python manage.py makemigrations
python manage.py migrate

If we run the server now, everything should be working fine:

python manage.py runserver

Dockerizing our Django Project

If you already have Redis installed locally, feel free to skip this whole step.

If you don’t have Redis installed, or don’t want to install it locally, the next best thing is to run it in a container.

To start, let’s dockerize our Django app.

Create a Dockerfile in the Django project root. Before building, generate a requirements.txt so the image has our dependencies to install:

pip freeze > requirements.txt

# Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip install -r requirements.txt 

This simply creates a Python container with our dependencies installed; the project files themselves are mounted into it by docker-compose.

Next, create a docker-compose.yml file in the Django project root.

version: "3.8"

services:
    django:
        build: .
        container_name: django
        command: python manage.py runserver 0.0.0.0:8000
        volumes:
            - .:/usr/src/app/
        ports:
            - "8000:8000"
        depends_on:
            - redis
    redis:
        image: "redis:alpine"
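
If you want to poke at the cache from your host machine with redis-cli, you can optionally publish Redis’s port. This tweak to the redis service is not required for the tutorial, just handy for debugging:

```yaml
    redis:
        image: "redis:alpine"
        ports:
            - "6379:6379"   # optional: expose Redis to the host for debugging
```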

Basically, we have two services:

One Django service that runs on localhost:8000.

One Redis service that runs on its default port, 6379.

Finally, run your services using the command:

docker-compose up 

Let’s go and create a basic API.

Views

# app/views.py
import json

from django.http import JsonResponse
from django.core import serializers

from app.models import SampleModel

def sample(request):
    objs = SampleModel.objects.all()
    data = serializers.serialize('json', objs)
    # serialize() returns a JSON string; parse it so JsonResponse doesn't encode it twice
    return JsonResponse(json.loads(data), safe=False)

This is a pretty simple function.
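One thing to watch: serializers.serialize returns a JSON string, so passing it straight to JsonResponse(..., safe=False) would JSON-encode it a second time and clients would receive one big quoted string. Parsing it back first avoids that; the standard json module makes the difference obvious:

```python
import json

# a stand-in for what serializers.serialize('json', ...) produces: a JSON string
rows = [{"model": "app.samplemodel", "pk": 1, "fields": {"title": "first"}}]
serialized = json.dumps(rows)

double_encoded = json.dumps(serialized)            # encoding the string again wraps it in quotes
parsed_then_encoded = json.dumps(json.loads(serialized))  # parse first, then encode once

print(double_encoded.startswith('"'))     # True: clients would get a quoted string
print(parsed_then_encoded == serialized)  # True: round-trips to the same JSON array
```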

In our core/urls.py:

# core/urls.py

from django.contrib import admin
from django.urls import path

from app.views import sample

urlpatterns = [
    path('admin/', admin.site.urls),
    path('sample', sample),
]

If we go to localhost:8000/sample we should see an empty array.

Let’s make another route that uses cache.

Integrating Redis with Django

You can hook up Redis with just a few lines of configuration and then use Django’s cache framework as usual.

First, install a package called django-redis.

pip install django-redis

Don’t forget to regenerate requirements.txt and rebuild the image:

pip freeze > requirements.txt
docker-compose build --no-cache

Once you have that installed and up and running, we should add the cache configuration.

# core/settings.py 

# other settings....

CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379/",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient"
        },
    }
}

Here we are telling Django to use django-redis as its default cache backend, and that our Redis service is reachable at redis:6379 (the hostname comes from the docker-compose service name).

That is all the configuration we need. The beauty of this approach is that if we want to switch to a different backend in the future, say Memcached, we only have to change this configuration block; the view code stays the same.
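If you skipped the Docker step and are running Redis directly on your machine, point LOCATION at localhost instead. The /1 at the end selects Redis database 1 and is just one common choice:

```python
# core/settings.py (local Redis instead of the Docker service)
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient"
        },
    }
}
```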

Cached View

Now that we have our cache set up, let’s add a cached view to app/views.py:

# app/views.py

# other code.....
import json  # if not already imported at the top

from django.core.cache import cache
from django.conf import settings
from django.core.cache.backends.base import DEFAULT_TIMEOUT

CACHE_TTL = getattr(settings, 'CACHE_TTL', DEFAULT_TIMEOUT)

def cached_sample(request):
    data = cache.get('sample')
    if data is None:
        # cache miss: query the database and store the serialized result
        objs = SampleModel.objects.all()
        data = serializers.serialize('json', objs)
        cache.set('sample', data, timeout=CACHE_TTL)
    return JsonResponse(json.loads(data), safe=False)

We first set a cache time-to-live (TTL) variable. We can customize it by defining CACHE_TTL in our Django settings; if it is not set, Django’s default timeout is used.
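For example, to cap cached entries at fifteen minutes (the value is just an illustration; pick whatever staleness your data can tolerate):

```python
# core/settings.py
CACHE_TTL = 60 * 15  # seconds; cached entries expire after 15 minutes
```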

The actual view is pretty simple: we check whether a key named sample exists in the cache. If not, we query the database, cache the result, and return it.
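Django’s cache API also offers cache.get_or_set, which collapses this check-then-set pattern into a single call. Here is a dict-backed sketch of that same logic in plain Python (a stand-in for the real cache backend, just to show the control flow):

```python
def get_or_set(cache, key, default_fn, timeout=None):
    """Return cache[key] if present; otherwise compute, store, and return it."""
    if key in cache:
        return cache[key]
    value = default_fn()   # only computed on a cache miss
    cache[key] = value     # a real backend would also apply the timeout
    return value

calls = []
def expensive_query():
    calls.append(1)        # track how many times we hit the "database"
    return '[{"title": "first"}]'

fake_cache = {}
first = get_or_set(fake_cache, "sample", expensive_query)
second = get_or_set(fake_cache, "sample", expensive_query)
print(first == second, len(calls))  # True 1  -- the query ran only once
```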

Don’t forget to add this view to our urls.

# core/urls.py

from django.contrib import admin
from django.urls import path

from app.views import sample, cached_sample

urlpatterns = [
    path('admin/', admin.site.urls),
    path('sample', sample),
    path('cache', cached_sample),
]

How do you use the Redis cache with Django REST framework?

Django REST framework is a powerful and flexible toolkit for building Web APIs, and its browsable API is a great usability win for your developers.

You can work with the cache in Django and Django REST framework in many different ways. One of them is to add caching as a method decorator on a ModelViewSet:

from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page
from django.views.decorators.vary import vary_on_cookie
from rest_framework.viewsets import ModelViewSet

class NameofViewSet(ModelViewSet):
    serializer_class = NameofSerializer
    queryset = NameofModel.objects.all()
    lookup_field = 'name_of_lookup_field'
    ### and more..

    @method_decorator(vary_on_cookie)
    @method_decorator(cache_page(60 * 60))  # cache responses for one hour
    def dispatch(self, *args, **kwargs):
        return super().dispatch(*args, **kwargs)

Testing

To conduct a load test, I will be using an npm package called loadtest.

You can install loadtest globally.

sudo npm install -g loadtest

Run the command:

loadtest -n 100 -k http://localhost:8000/sample

This should bring up a bunch of results.

INFO Requests per second: 34

This means our API can handle only 34 requests per second on average.

Let’s test the caching API.

loadtest -n 100 -k http://localhost:8000/cache

The result is:

Requests per second: 55

The result is much better: the cached endpoint handles roughly 60% more requests per second, since cache hits skip the database query and serialization entirely. (Exact numbers will vary with your hardware.)

Congratulations, you got your first taste of caching with Redis in Django.