[Django]-Set up a scheduled job?

405👍

✅

One solution that I have employed is to do this:

1) Create a custom management command, e.g.

python manage.py my_cool_command

2) Use cron (on Linux) or at (on Windows) to run my command at the required times.

This is a simple solution that doesn't require installing a heavy AMQP stack. However, there are nice advantages to using something like Celery, mentioned in the other answers. In particular, with Celery it is nice not to have to spread your application logic out into crontab files. The cron solution, however, works quite nicely for a small to medium sized application where you don't want a lot of external dependencies.
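
For reference, a minimal sketch of such a command (assuming an app named myapp; the command name and crontab entry below are placeholders):

# myapp/management/commands/my_cool_command.py
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Does the periodic work."

    def handle(self, *args, **options):
        # the periodic logic goes here
        self.stdout.write("my_cool_command ran")

The matching crontab entry could then look something like:

0 * * * * /path/to/venv/bin/python /path/to/project/manage.py my_cool_command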

EDIT:

In later versions of Windows, the at command is deprecated (Windows 8, Server 2012 and above). You can use schtasks.exe instead.

**** UPDATE ****
This is the new link to the Django docs for writing custom management commands.

👤Brian Neal

172👍

Celery is a distributed task queue, built on AMQP (RabbitMQ). It also handles periodic tasks in a cron-like fashion (see periodic tasks). Depending on your app, it might be worth a gander.

Celery is pretty easy to set up with Django (docs), and periodic tasks will actually skip missed tasks in case of downtime. Celery also has built-in retry mechanisms in case a task fails.
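
As a rough sketch of that setup with the current Celery API (the project, app and task names below are placeholders):

# myproject/celery.py
import os

from celery import Celery
from celery.schedules import crontab

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# run a (hypothetical) myapp.tasks.cleanup task every night at midnight
app.conf.beat_schedule = {
    'nightly-cleanup': {
        'task': 'myapp.tasks.cleanup',
        'schedule': crontab(hour=0, minute=0),
    },
}

The schedule itself is driven by a celery beat process running alongside the workers.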

👤dln

55👍

We've open-sourced what I think is a structured app that Brian's solution above alludes to. We would love any and all feedback!

https://github.com/tivix/django-cron

It comes with one management command:

./manage.py runcrons

That does the job. Each cron is modeled as a class (so it's all OO), each cron runs at a different frequency, and we make sure the same cron type doesn't run in parallel (in case the crons themselves take longer to run than their frequency!).
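
For illustration, a cron class looks roughly like this (based on django-cron's CronJobBase/Schedule API; the class name and code string are placeholders):

# myapp/cron.py
from django_cron import CronJobBase, Schedule


class MyCronJob(CronJobBase):
    RUN_EVERY_MINS = 60  # run once an hour

    schedule = Schedule(run_every_mins=RUN_EVERY_MINS)
    code = 'myapp.my_cron_job'  # unique identifier for this cron

    def do(self):
        pass  # the actual work goes here

The class is then listed in the CRON_CLASSES setting so that ./manage.py runcrons picks it up.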

👤chachra

37👍

If you're using a standard POSIX OS, you use cron.

If you're using Windows, you use at.

Write a Django management command to

  1. Figure out what platform they're on.

  2. Either execute the appropriate "at" command for your users, or update the crontab for your users (see the sketch below).
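
A hedged sketch of that idea (the command name, schedule and paths are all placeholders):

# myapp/management/commands/install_schedule.py
import platform
import subprocess

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Registers the periodic job with the OS scheduler."

    def handle(self, *args, **options):
        if platform.system() == 'Windows':
            # schtasks replaces the deprecated `at` on modern Windows
            subprocess.run(['schtasks', '/Create', '/SC', 'HOURLY',
                            '/TN', 'my_cool_command',
                            '/TR', r'python C:\path\to\manage.py my_cool_command'])
        else:
            # append an entry to the current user's crontab
            entry = '0 * * * * /path/to/manage.py my_cool_command\n'
            current = subprocess.run(['crontab', '-l'],
                                     capture_output=True, text=True).stdout
            subprocess.run(['crontab', '-'], input=current + entry, text=True)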

👤S.Lott

23👍

Interesting new pluggable Django app: django-chronograph

You only have to add one cron entry, which acts as a timer, and you get a very nice Django admin interface for the scripts to run.

👤Van Gale

16👍

Look at Django Poor Man's Cron, which is a Django app that makes use of spambots, search engine indexing robots and the like to run scheduled tasks at approximately regular intervals.

See: http://code.google.com/p/django-poormanscron/

👤user41767

15👍

I had exactly the same requirement a while ago, and ended up solving it using APScheduler (User Guide).

It makes scheduling jobs super simple, and keeps them independent from request-based execution of code. The following is a simple example.

from apscheduler.schedulers.background import BackgroundScheduler

scheduler = BackgroundScheduler()
job = None

def tick():
    print('One tick!')

def start_job():
    global job
    job = scheduler.add_job(tick, 'interval', seconds=3600)
    try:
        scheduler.start()
    except Exception:
        # the scheduler may already be running
        pass
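
One way to wire this into Django (a sketch, assuming the code above lives in myapp/scheduler.py) is to start it from an AppConfig.ready() hook:

# myapp/apps.py
from django.apps import AppConfig


class MyAppConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        # note: with runserver's autoreloader, ready() may run in more than one process
        from . import scheduler
        scheduler.start_job()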

Hope this helps somebody!

👤PhoenixDev

11👍

Brian Neal's suggestion of running management commands via cron works well, but if you're looking for something a little more robust (yet not as elaborate as Celery) I'd look into a library like Kronos:

# app/cron.py

import kronos

@kronos.register('0 * * * *')
def task():
    pass
👤Johannes Gorset

11👍

Django APScheduler for scheduled jobs. Advanced Python Scheduler (APScheduler) is a Python library that lets you schedule your Python code to be executed later, either just once or periodically. You can add new jobs or remove old ones on the fly as you please.

Note: I'm the author of this library.

Install APScheduler

pip install apscheduler

Write the function you want to call

file name: scheduler_jobs.py

def FirstCronTest():
    print("")
    print("I am executed..!")

Configuring the scheduler

Create an execute.py file and add the code below

from apscheduler.schedulers.background import BackgroundScheduler
scheduler = BackgroundScheduler()

Your functions go here; the scheduler functions are written in scheduler_jobs.py

import scheduler_jobs 

scheduler.add_job(scheduler_jobs.FirstCronTest, 'interval', seconds=10)
scheduler.start()

Link the File for Execution

Now, add the line below at the bottom of your URLs file (urls.py)

import execute
👤Chandan Sharma

10👍

RabbitMQ and Celery have more features and task handling capabilities than Cron. If task failure isn't an issue, and you think you will handle broken tasks in the next call, then Cron is sufficient.

Celery & AMQP will let you handle the broken task, and it will get executed again by another worker (Celery workers listen for the next task to work on), until the task's max_retries attribute is reached. You can even invoke tasks on failure, like logging the failure, or sending an email to the admin once max_retries has been reached.

And you can distribute Celery and AMQP servers when you need to scale your application.
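
A hedged sketch of such a retrying task (the task name, body and retry policy are placeholders):

# myapp/tasks.py
from celery import shared_task


@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def fetch_remote_data(self):
    try:
        return 1 + 1  # placeholder for the work that might fail
    except Exception as exc:
        # re-queue the task; Celery gives up after max_retries
        raise self.retry(exc=exc)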

👤Ravi Kumar

8👍

I personally use cron, but the Jobs Scheduling part of django-extensions looks interesting.

👤Van Gale

8👍

Although not part of Django, Airflow is a more recent project (as of 2016) that is useful for task management.

Airflow is a workflow automation and scheduling system that can be used to author and manage data pipelines. A web-based UI provides the developer with a range of options for managing and viewing these pipelines.

Airflow is written in Python and is built using Flask.

Airflow was created by Maxime Beauchemin at Airbnb and open-sourced in the spring of 2015. It joined the Apache Software Foundation's incubation program in the winter of 2016. Here is the Git project page and some additional background information.

👤Alexander

6👍

Put the following at the top of your cron.py file:

#!/usr/bin/python
import os, sys
sys.path.append('/path/to/') # the parent directory of the project
sys.path.append('/path/to/project') # these lines only needed if not on path
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproj.settings'

# imports and code below
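
On current Django versions, the same bootstrapping is more commonly done with django.setup() (a sketch, assuming the same myproj.settings module):

#!/usr/bin/env python
import os
import sys

sys.path.append('/path/to/project')  # only needed if the project is not on the path
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproj.settings')

import django
django.setup()

# imports and code below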
👤Matt McCormick

6👍

I just thought about this rather simple solution:

  1. Define a view function do_work(req, param) like you would for any other view, with URL mapping, returning an HttpResponse, and so on.
  2. Set up a cron job with your timing preferences (or using AT or Scheduled Tasks in Windows) which runs curl http://localhost/your/mapped/url?param=value.

You can add parameters by just adding them to the URL.

Tell me what you guys think.

[Update] I'm now using the runjobs command from django-extensions instead of curl.

My cron looks something like this:

@hourly python /path/to/project/manage.py runjobs hourly

… and so on for daily, monthly, etc. You can also set it up to run a specific job.

I find it more manageable and cleaner. It doesn't require mapping a URL to a view. Just define your job class and crontab and you're set.
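
For reference, a job for runjobs is a module under an app's jobs/<frequency>/ package that exposes a Job class (a sketch; the app and job names are placeholders):

# myapp/jobs/hourly/send_reminders.py
from django_extensions.management.jobs import HourlyJob


class Job(HourlyJob):
    help = "Send reminder emails."

    def execute(self):
        pass  # the actual work goes here

python manage.py runjobs hourly then picks it up.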

👤Michael

4👍

After the following bit of code, I can write anything just like in my views.py 🙂

#######################################
import os,sys
sys.path.append('/home/administrator/development/store')
os.environ['DJANGO_SETTINGS_MODULE']='store.settings'
from django.core.management import setup_environ
from store import settings
setup_environ(settings)
#######################################

from
http://www.cotellese.net/2007/09/27/running-external-scripts-against-django-models/

👤xiaohei

4👍

You should definitely check out django-q!
It requires no additional configuration and has quite possibly everything needed to handle any production issues on commercial projects.

It's actively developed and integrates very well with Django, the Django ORM, Mongo, and Redis. Here is my configuration:

# django-q
# -------------------------------------------------------------------------
# See: http://django-q.readthedocs.io/en/latest/configure.html
Q_CLUSTER = {
    # Match recommended settings from docs.
    'name': 'DjangoORM',
    'workers': 4,
    'queue_limit': 50,
    'bulk': 10,
    'orm': 'default',

    # Custom Settings
    # ---------------
    # Limit the amount of successful tasks saved to Django.
    'save_limit': 10000,

    # See https://github.com/Koed00/django-q/issues/110.
    'catch_up': False,

    # Number of seconds a worker can spend on a task before it's terminated.
    'timeout': 60 * 5,

    # Number of seconds a broker will wait for a cluster to finish a task before presenting it again.
    # This needs to be longer than `timeout`, otherwise the same task will be processed multiple times.
    'retry': 60 * 6,

    # Whether to force all async() calls to be run with sync=True (making them synchronous).
    'sync': False,

    # Redirect worker exceptions directly to Sentry error reporter.
    'error_reporter': {
        'sentry': RAVEN_CONFIG,
    },
}
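
Scheduling work with django-q then looks roughly like this (a sketch; the dotted function path is a placeholder, and async_task is the newer spelling of what older django-q versions exposed as async):

from django_q.tasks import async_task, schedule
from django_q.models import Schedule

# fire-and-forget task, executed by the qcluster process
async_task('myapp.services.send_report')

# recurring task, run once a day
schedule('myapp.services.send_report', schedule_type=Schedule.DAILY)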
👤saran3h

3👍

Yes, the methods above are great, and I tried some of them. In the end, I found a method like this:

    from threading import Timer

    INTERVAL = 3600  # seconds between runs

    def sync():
        # do something...

        # then schedule the next run
        sync_timer = Timer(INTERVAL, sync)
        sync_timer.start()

    sync()  # kick off the first run

Just like recursion.

Ok, I hope this method can meet your requirement. 🙂

👤Ni Xiaoni

3👍

A more modern solution (compared to Celery) is Django Q:
https://django-q.readthedocs.io/en/latest/index.html

It has great documentation and is easy to grok. Windows support is lacking, because Windows does not support process forking. But it works fine if you create your dev environment using the Windows Subsystem for Linux.

👤devdrc

2👍

I had something similar to your problem today.

I didn't want to have it handled by the server through cron (and most of the libs were just cron helpers in the end).

So I've created a scheduling module and attached it to the init.

It's not the best approach, but it helps me to have all the code in a single place, with its execution tied to the main app.

👤Fabricio Buzeto

1👍

I use Celery to create my periodic tasks. First you need to install it as follows:

pip install django-celery

Don't forget to register django-celery in your settings, and then you can do something like this:

from celery import task
from celery.decorators import periodic_task
from celery.task.schedules import crontab
from celery.utils.log import get_task_logger
@periodic_task(run_every=crontab(minute="0", hour="23"))
def do_every_midnight():
    # your code goes here
    pass

1👍

I am not sure whether this will be useful for anyone. Since I had to let other users of the system schedule jobs without giving them access to the actual server's (Windows) Task Scheduler, I created this reusable app.

Please note users have access to one shared folder on the server where they can create the required command/task/.bat file. This task can then be scheduled using this app.

App name is Django_Windows_Scheduler


👤just10minutes

0👍

If you want something more reliable than Celery, try TaskHawk which is built on top of AWS SQS/SNS.

Refer: http://taskhawk.readthedocs.io

👤Sri

0👍

For simple dockerized projects, I could not really see any existing answer fit.

So I wrote a very barebones solution, without the need for external libraries or triggers, which runs on its own. No external OS cron is needed; it should work in every environment.

It works by adding a middleware: middleware.py

import threading

def should_run(name, seconds_interval):
    from application.models import CronJob
    from django.utils.timezone import now

    try:
        c = CronJob.objects.get(name=name)
    except CronJob.DoesNotExist:
        CronJob(name=name, last_ran=now()).save()
        return True

    if (now() - c.last_ran).total_seconds() >= seconds_interval:
        c.last_ran = now()
        c.save()
        return True

    return False


class CronTask:
    def __init__(self, name, seconds_interval, function):
        self.name = name
        self.seconds_interval = seconds_interval
        self.function = function


def cron_worker(*_):
    if not should_run("main", 60):
        return

    # customize this part:
    from application.models import Event
    tasks = [
        CronTask("events", 60 * 30, Event.clean_stale_objects),
        # ...
    ]

    for task in tasks:
        if should_run(task.name, task.seconds_interval):
            task.function()


def cron_middleware(get_response):

    def middleware(request):
        response = get_response(request)
        threading.Thread(target=cron_worker).start()
        return response

    return middleware

models/cron.py:

from django.db import models


class CronJob(models.Model):
    name = models.CharField(max_length=10, primary_key=True)
    last_ran = models.DateTimeField()

settings.py:

MIDDLEWARE = [
    ...
    'application.middleware.cron_middleware',
    ...
]
👤yspreen

0👍

A simple way is to write a custom management command (see the Django documentation) and execute it using a cron job on Linux. However, I would highly recommend using a message broker like RabbitMQ coupled with Celery. Maybe you can have a look at
this tutorial.

👤Hamfri

0👍

One alternative is to use Rocketry:

from rocketry import Rocketry
from rocketry.conds import daily, after_success

app = Rocketry()

@app.task(daily.at("10:00"))
def do_daily():
    ...

@app.task(after_success(do_daily))
def do_after_another():
    ...

if __name__ == "__main__":
    app.run()

It also supports custom conditions:

from pathlib import Path

@app.cond()
def file_exists(file):
    return Path(file).exists()

@app.task(daily & file_exists("myfile.csv"))
def do_custom():
    ...

And it also supports Cron:

from rocketry.conds import cron

@app.task(cron('*/2 12-18 * Oct Fri'))
def do_cron():
    ...

It can be integrated quite nicely with FastAPI, and I think it could be integrated with Django as well, as Rocketry is essentially just a sophisticated loop that can spawn async tasks, threads and processes.

Disclaimer: I'm the author.

👤miksus

0👍

Another option, similar to Brian Neal's answer, is to use RunScripts.

Then you don't need to set up commands. This has the advantage of a more flexible or cleaner folder structure.

This file must implement a run() function. This is what gets called when you run the script. You can import any models or other parts of your Django project to use in these scripts.
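
A minimal sketch of such a script (the script and model names are placeholders):

# myapp/scripts/delete_stale_records.py
def run():
    # models can be imported here because manage.py has already set Django up
    from myapp.models import MyModel
    MyModel.objects.filter(active=False).delete()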

And then, just

python manage.py runscript path.to.script
👤Roman
