[Answered] How to duplicate a Django Celery worker?

1👍

To do this you would have to configure the exchange to be a topic exchange (as you say):

CELERY_QUEUES = {
   'celery': {
       'exchange': 'celerytopic',
       'exchange_type': 'topic',
       'routing_key': 'celery',
   },
}

Then you can create your backup queue and binding using the AMQP API:

```
from celery import current_app as celery

with celery.broker_connection() as conn:
    # Declare the durable backup queue and bind it to the topic exchange.
    conn.default_channel.queue_declare(queue='celery.backup', durable=True)
    conn.default_channel.queue_bind(queue='celery.backup',
                                    exchange='celerytopic',
                                    routing_key='celery')
```

(Durability is a property of the queue itself, set at `queue_declare`; `queue_bind` takes no `durable` argument.)

Since you already have a queue named `celery`, you may have to delete that first:

$ camqadm queue.delete celery
👤asksol

1👍

Trying to start this task on two different machines doesn’t make sense to me. At the very least, Celery cannot guarantee that a task will run on different machines: it is RabbitMQ that distributes the load, and if one node is less loaded than the others, both task runs will probably end up on that same machine.

Use task.retry instead. Celery will retry a task if it fails to execute, and it is smart enough to detect the failure. Just make sure to raise an exception when the task fails, rather than returning silently when it cannot log successfully.
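The raise-on-failure point matters because retry machinery only fires when the task body signals an error. A minimal plain-Python sketch of that contract (Celery's `self.retry()` applies the same idea with broker-backed scheduling; `run_with_retries` and `max_retries` here are illustrative names, not Celery API):

```python
def run_with_retries(task, max_retries=3):
    """Call `task` until it succeeds or retries are exhausted.

    A retry only happens when the task raises -- a task that swallows
    its errors and returns None silently is treated as successful,
    which is exactly why silent failure defeats the retry mechanism.
    """
    last_error = None
    for attempt in range(1 + max_retries):
        try:
            return task()
        except Exception as exc:
            last_error = exc
    raise last_error

# A flaky task that fails twice, then succeeds on the third call.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("could not log")
    return "ok"

print(run_with_retries(flaky))  # -> ok
```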

UPDATE:

A possible workflow could be: try to execute the task; if it fails, change the routing_key in on_retry and execute the task against a different exchange/queue, which can serve as your fail-over queue.
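The fail-over idea can be sketched without Celery: attempt delivery with the primary routing key, and on error re-route to a backup key. The function names, the `send` callable, and the queue names are illustrative assumptions, not Celery API:

```python
PRIMARY_KEY = "celery"          # normal routing key (as in the config above)
FALLBACK_KEY = "celery.backup"  # hypothetical fail-over queue

def route_task(send, routing_key=PRIMARY_KEY):
    """Try the primary route; on failure, re-route to the fail-over queue."""
    try:
        return send(routing_key)
    except Exception:
        if routing_key == FALLBACK_KEY:
            raise  # already on the fail-over path; give up
        return route_task(send, routing_key=FALLBACK_KEY)

# A sender whose primary route is down, to exercise the fall-back path.
def send(key):
    if key == PRIMARY_KEY:
        raise ConnectionError("primary route unavailable")
    return "delivered via " + key

print(route_task(send))  # -> delivered via celery.backup
```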

👤Tisho
