[Answered] Django with Celery on Digital Ocean


Ultimately, the answer I uncovered is a compromise: a workaround using a different Digital Ocean (D.O.) product. Instead of a Droplet (which involves manually installing and configuring Linux and Redis, but gives you greater control), I used a Managed Database (which simplifies things but gives you much less control). This isn’t ideal for two reasons. First, it costs more ($15 base cost for the Managed Database vs $6 for a basic Droplet). Second, I would have preferred to work out how to set up Redis manually (and thus retain greater control). However, I’ll take a working solution over no solution for a very niche issue.

The steps to use a D.O. Managed Redis DB are:

  • Provision the managed Redis DB
  • Use the Public Network Connection String (since the connection string includes the password, I store it in an environment variable)
  • Ensure that you have the appropriate SSL settings in the ‘celery.py’ file (snippet below)
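For the second step, the connection string can be exported as an environment variable before starting the app. A minimal sketch; the hostname, port, and password below are placeholders, not real values — use the connection string from your D.O. control panel:

```shell
# D.O. managed Redis uses the rediss:// scheme (Redis over TLS).
# Placeholder values -- substitute your own host, port, and password.
export REDIS_URI="rediss://default:your-password@your-db-host.db.ondigitalocean.com:25061"
```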

celery.py

import os

from celery import Celery
import ssl  # ssl.CERT_NONE is referenced below

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj_name.settings')

app = Celery(
    'proj_name',
    broker_use_ssl={'ssl_cert_reqs': ssl.CERT_NONE},
    redis_backend_use_ssl={'ssl_cert_reqs': ssl.CERT_NONE}
)


app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

settings.py

REDIS_URI = os.environ.get('REDIS_URI')
CELERY_BROKER_URL = f'{REDIS_URI}/0'
CELERY_RESULT_BACKEND = f'{REDIS_URI}/1'
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
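As an alternative to passing broker_use_ssl and redis_backend_use_ssl in celery.py, Celery also accepts the certificate requirement as a query parameter on a rediss:// URL, which keeps the SSL configuration in settings.py. A sketch, assuming the same REDIS_URI environment variable (the fallback value is a placeholder for illustration only):

```python
import os

# Placeholder fallback for illustration; in practice REDIS_URI comes from the environment.
REDIS_URI = os.environ.get('REDIS_URI', 'rediss://default:password@host:25061')

# With a rediss:// URL, Celery accepts ssl_cert_reqs as a query parameter,
# so the explicit ssl options in celery.py become unnecessary.
CELERY_BROKER_URL = f'{REDIS_URI}/0?ssl_cert_reqs=CERT_NONE'
CELERY_RESULT_BACKEND = f'{REDIS_URI}/1?ssl_cert_reqs=CERT_NONE'
```

Note that disabling certificate verification (CERT_NONE) trades security for convenience; D.O. also lets you download the database’s CA certificate if you prefer verified connections.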
