[Django]-Celery task state always pending

8👍

So, your settings are wrong. 🙂 You also need to set up a broker for Celery to work.

First of all, djcelery is deprecated; everything needed for Celery to work with Django is now included in celery itself.

Second of all, do not accept all content types, since that is a potential security risk. Use pickle only when plain json is not enough (say, when you pass functions or objects as arguments to tasks, or return them from tasks).
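A quick illustration of the difference (the add_user task here is hypothetical): with the json serializer, anything that is not JSON-serializable fails at call time, which is exactly when you would reach for pickle.

from myapp.tasks import add_user  # hypothetical task

class Profile(object):
    pass

add_user.delay({'name': 'alice'})  # fine: plain JSON types
add_user.delay(Profile())          # fails to serialize with the json serializer (kombu raises EncodeError)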

My guess is that you are just trying Celery out, which is why you are using the database backend. That is fine for experimenting, but for production use I would recommend RabbitMQ.

In any case, give it a try with those settings:

BROKER_URL = 'django://'
INSTALLED_APPS = (
    ...
    'kombu.transport.django',
    ...
)
# replace "scheme" with your database driver, e.g. db+postgresql:// or db+mysql://
CELERY_RESULT_BACKEND = 'db+scheme://user:password@host:port/dbname'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False  # this is less important

Then run python manage.py syncdb so the django transport can create its tables.

Just so you know, I have not used the database as a broker or result backend myself, so this setup might be incomplete or even incorrect, but give it a try anyway.

See the Celery docs for more CELERY_RESULT_BACKEND database examples.
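Once a worker is running, a quick way to verify that results are actually being stored is to round-trip a trivial task. This is a minimal sketch; the myapp.tasks module and the ping task are hypothetical:

# myapp/tasks.py
from celery import shared_task

@shared_task
def ping():
    return 'pong'

Then, in a Django shell with a worker running:

>>> from myapp.tasks import ping
>>> result = ping.delay()
>>> result.get(timeout=10)  # 'pong' if the result backend works
>>> result.state            # 'SUCCESS' instead of 'PENDING'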

In case you want to set up RabbitMQ as the broker backend, which I would recommend and which I know for sure will work:

If you are on Ubuntu, run:

sudo apt-get install rabbitmq-server
sudo rabbitmqctl add_user <username> <password>
sudo rabbitmqctl add_vhost <vhost, use project name for example>
sudo rabbitmqctl set_permissions -p <vhost> <username> ".*" ".*" ".*"

Then configure celery in settings.py:

BROKER_URL = 'amqp://<user>:<password>@localhost:5672/<vhost>'
CELERY_TIMEZONE = TIME_ZONE
CELERY_RESULT_BACKEND = 'amqp'
# worker state (e.g. revoked tasks) is kept here so it survives worker restarts:
CELERYD_STATE_DB = "/full/path/data/celery_worker_state"
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
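These settings only take effect if your Celery app actually loads them. For completeness, here is a minimal celery.py in the Celery 3.x style (the project name proj is an assumption):

# proj/celery.py
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')  # picks up the CELERY_* settings above
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

Start a worker with celery -A proj worker -l info and it will use the broker and result backend configured above.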

Let me know how it goes.

👤lehins

15👍

Straight from the docs: Result backend does not work or tasks are always in PENDING state.

All tasks are PENDING by default, so the state would have been better named “unknown”. Celery does not update any state when a task is sent, and any task with no history is assumed to be pending (you know the task id after all).
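You can see this behaviour directly: asking about a task id that was never created still reports PENDING.

>>> from celery.result import AsyncResult
>>> AsyncResult('some-id-that-never-existed').state
'PENDING'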

  1. Make sure that the task does not have ignore_result enabled.

    Enabling this option will force the worker to skip updating states.

  2. Make sure the CELERY_IGNORE_RESULT setting is not enabled.

  3. Make sure that you do not have any old workers still running.

    It’s easy to start multiple workers by accident, so make sure that the previous worker is properly shutdown before you start a new one.

    An old worker that is not configured with the expected result backend may be running and is hijacking the tasks.

    The --pidfile argument can be set to an absolute path to make sure this doesn’t happen.

  4. Make sure the client is configured with the right backend.

If for some reason the client is configured to use a different backend than the worker, you will not be able to receive the result, so make sure the backend is correct by inspecting it:

>>> result = task.delay(…)
>>> print(result.backend)
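The same applies to points 1 and 2: the flags can be inspected directly (the add task is hypothetical):

>>> from myapp.tasks import add  # hypothetical task
>>> add.ignore_result            # should be False if you expect results
False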

0👍

If you are using the old django-celery with RabbitMQ as the result backend, then these settings may help:

# Mostly, all settings are the same as in other answers

CELERY_RESULT_BACKEND = 'rpc://'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False

# This line is what I needed
CELERY_TRACK_STARTED = True
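With CELERY_TRACK_STARTED enabled, a task a worker has picked up reports STARTED instead of sitting in PENDING until it finishes. A sketch with a hypothetical long_task:

>>> from myapp.tasks import long_task  # hypothetical long-running task
>>> result = long_task.delay()
>>> result.state  # 'STARTED' once a worker picks it up, rather than 'PENDING'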
