5👍
One way to guarantee that the Celery worker uses the same test database as the tests is to spawn the Celery worker inside the test itself. This can be done with the `start_worker` context manager from `celery.contrib.testing.worker`, entered in the `setUpClass` method of the TestCase:

```python
from celery.contrib.testing.worker import start_worker
from myproject.celery import app

@classmethod
def setUpClass(cls):
    super().setUpClass()
    # start_worker is a context manager; enter it here and exit it in tearDownClass
    cls.celery_worker = start_worker(app)
    cls.celery_worker.__enter__()
```
You also have to use a `SimpleTestCase` from Django (or an `APISimpleTestCase` from Django REST Framework) rather than a plain `TestCase`, so that the Celery thread and the test thread can see the changes each makes to the test database. The changes are still destroyed at the end of testing, but they are not destroyed between tests unless you destroy them manually in the `tearDown` method.
4👍
I battled with a similar problem. The following solution is not clean but it works.
- Create a separate Django settings file that inherits from your main one. Let's call it `integration_testing.py`. It should look like this:

```python
from .settings import *

DATABASES = {
    'default': {
        'ENGINE': '<your engine>',
        'NAME': 'test_<your database name>',
        'USER': '<your db user>',
        'PASSWORD': '<your db password>',
        'HOST': '<your hostname>',
        'PORT': '<your port number>',
    }
}
```
- Create a shell script that sets your environment and starts up the Celery worker:

```bash
#!/usr/bin/env bash
export DJANGO_SETTINGS_MODULE="YOURPROJECTNAME.settings.integration_testing"
celery purge -A YOURPROJECTNAME -f && celery worker -A YOURPROJECTNAME -l debug
```
- The above works if you configured Celery in this manner:

```python
app = Celery('YOURPROJECTNAME')
app.config_from_object('django.conf:settings', namespace='CELERY')
```
- Run the script in the background.
- Make all tests that involve Celery inherit from `TransactionTestCase` (or `APITransactionTestCase` in django-rest-framework).
- Run your unit tests that use Celery. Any Celery tasks will now use your test DB. And hope for the best.
2👍
There's no obvious problem with your code. You don't need to run a Celery worker: with these settings, Celery will run the task synchronously and won't actually send anything to your message queue.
You can't easily run tests with live Celery workers anyway, because each test is wrapped in a transaction, so even if the tests and the worker were connecting to the same database (which they aren't), the transactions are always rolled back by the test and never visible to the worker.
If you really need to do this, look at this Stack Overflow answer.
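The settings referred to are presumably Celery's eager mode; a minimal sketch of what that looks like in a Django settings file (using the `CELERY_` namespace convention — an assumption about the asker's setup):

```python
# Eager mode: .delay()/.apply_async() run the task synchronously in the
# calling process, so nothing is ever sent to the broker.
CELERY_TASK_ALWAYS_EAGER = True
# Re-raise exceptions from eager tasks so failures surface in the test.
CELERY_TASK_EAGER_PROPAGATES = True
```
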
0👍
I have found that adding the following to `conftest.py` works:

```python
from django.conf import settings
...

@pytest.fixture(scope="session")
def celery_worker_parameters(django_db_setup):
    assert settings.DATABASES["default"]["NAME"].startswith("test_")
    return {}
```

The trick is to request the `django_db_setup` fixture here, so that it is enabled on the worker.
This works for tests marked with:

```python
@pytest.mark.django_db(transaction=True)
@pytest.mark.celery()
def test_something(celery_worker):
    ...
```
0👍
"Question is, how can I get celery to use the same temporary db as the
rest of my tests?"
I solved it by running my tests using docker compose, making the database name configurable by environment variable, and setting the database name to test_db (normal db name is ‘db’).
But I don’t use sqlite…
If you need a solution with sqlite: Make Django test case database visible to Celery
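A sketch of how the database name can be made configurable in settings (the environment variable name and the Postgres engine are assumptions; export `POSTGRES_DB=test_db` for both the test runner and the worker container):

```python
import os

# Both the test runner and the worker container read the same variable,
# so exporting POSTGRES_DB=test_db points them at the same database.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "db"),
    }
}
```
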