[Django]-Multiple Django Celery Tasks are trying to save to the same object and failing

2👍

Django’s ORM can play a trick here. model_object.save() writes every field back to the database, so two tasks saving the same object concurrently can silently overwrite each other’s changes. If your tasks update different fields of the same object, consider ModelClass.objects.filter(pk=model_id).update(some_field=some_value), which issues a single UPDATE touching only that column; keep in mind, though, that the exact behaviour still depends on how your RDBMS implements table/row locking.
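A minimal sketch of the difference, assuming a hypothetical UserProfile model and a profile_id/fetched_data already in scope:

from myapp.models import UserProfile  # hypothetical model

# Risky: save() writes back ALL fields, so two tasks doing this
# concurrently can clobber each other's changes.
profile = UserProfile.objects.get(pk=profile_id)
profile.quora_data = fetched_data
profile.save()

# Safer for this case: update() issues a single UPDATE touching only
# the named column, so tasks writing different fields don't collide.
UserProfile.objects.filter(pk=profile_id).update(quora_data=fetched_data)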

Another option is to use a Celery chord and update the user profile only once all the data-fetching tasks have completed. You may need to implement a distributed semaphore, so that only one chord callback executes for the same user profile at a time.
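A rough sketch of the chord approach, assuming hypothetical fetch_quora/fetch_twitter fetchers and the UserProfile model from above:

from celery import chord, shared_task

from myapp.models import UserProfile  # hypothetical model

@shared_task
def fetch_quora(profile_id):
    # Return the fetched data instead of saving it here.
    return {"quora_data": get_quora_data(profile_id)}  # hypothetical fetcher

@shared_task
def fetch_twitter(profile_id):
    return {"twitter_data": get_twitter_data(profile_id)}  # hypothetical fetcher

@shared_task
def save_profile(results, profile_id):
    # Runs once, after every header task has finished, so there is
    # only a single writer for this profile.
    merged = {}
    for result in results:
        merged.update(result)
    UserProfile.objects.filter(pk=profile_id).update(**merged)

# The header tasks run in parallel; the callback receives their results.
chord([fetch_quora.s(profile_id),
       fetch_twitter.s(profile_id)])(save_profile.s(profile_id))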

3👍

I assume you’re using Django because you tagged it as such. If so, you can use select_for_update (documentation) to lock the objects. This will block the other workers until the transaction completes. If your tasks run for a long time you could get timeouts, so catch that exception and retry if necessary.

from celery import shared_task
from django.db import transaction

@shared_task
def mytask(mpk):
    # atomic() replaces the long-removed commit_on_success(); the row lock
    # taken by select_for_update() is held until this block exits.
    with transaction.atomic():
        my_obj = MyModel.objects.select_for_update().get(pk=mpk)
        ...

Note that this won’t work with SQLite, which doesn’t support SELECT ... FOR UPDATE; Django silently leaves the query unlocked there.
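A sketch of the retry pattern mentioned above, assuming the lock wait timeout surfaces as django.db.OperationalError (the exact exception can vary by backend):

from celery import shared_task
from django.db import OperationalError, transaction

@shared_task(bind=True, max_retries=3)
def mytask(self, mpk):
    try:
        with transaction.atomic():
            my_obj = MyModel.objects.select_for_update().get(pk=mpk)
            ...
    except OperationalError as exc:
        # Lock wait timed out or a deadlock was detected: back off,
        # then let Celery re-run the task.
        raise self.retry(exc=exc, countdown=2)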

👤joshua

0👍

Looks like it’s more of a database lock issue. Have you tried editing your configuration file to allow more concurrent connections to your database? For instance, for PostgreSQL 9.4 on Debian, edit the conf file:

nano /etc/postgresql/9.4/main/postgresql.conf

Then you could set something like this in the conf file :

max_connections = 100
shared_buffers = 3000MB
temp_buffers = 800MB
effective_io_concurrency = 5
max_worker_processes = 15

This should allow you to read/write as you are describing.
