[Answer]-Sharing models (and tables) between two Django projects

1👍

In other words, you’re reinventing a task queue?

That is, the interface merely inserts records that represent “do this for me”, and later retrieves results from the “this was done for you” table (or the same table; it doesn’t matter)?

What you’re really looking for is some kind of remote asynchronous RPC interface, which yes, you could rebuild in this manner if you were so inclined.

I would still recommend reevaluating celery – I put it off on several occasions, but now that I have it set up it shocks me I didn’t use it earlier. You could even use the Django DB as the message queue backend (though I’d only recommend that for low-volume sites).
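For what it’s worth, with the Celery versions contemporary with Django 1.5 (the 3.x line), using the Django DB as the broker was just a settings change via kombu’s bundled django transport. A minimal sketch (assuming celery/kombu are installed):

```python
# settings.py -- sketch for Celery 3.x using the Django DB as the message broker.
# 'django://' tells kombu to store queue messages in Django-managed tables.
BROKER_URL = 'django://'

INSTALLED_APPS = (
    # ... your apps ...
    'kombu.transport.django',  # creates the queue tables on syncdb/migrate
)
```

Fine for low volume, as noted above; a real broker (RabbitMQ, Redis) is the better choice once traffic grows.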

Anyway, as to the specific question:

There are no inherent issues with two independent processes using the same DB tables, and neither Django nor your DB connector adds any constraints in this regard.
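To make that concrete, here is a tiny runnable stand-in for the two processes: two completely separate connections to one database, one playing “interface” (inserting work) and one playing “solve” (reading it back). SQLite and the raw SQL are just for illustration; any shared DB behaves the same way:

```python
import os
import sqlite3
import tempfile

# A database file both "processes" point at.
path = os.path.join(tempfile.mkdtemp(), "shared.db")

# "interface" process: inserts a record meaning "do this for me".
ui = sqlite3.connect(path)
ui.execute(
    "CREATE TABLE task (id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)"
)
ui.execute("INSERT INTO task (payload) VALUES ('solve this')")
ui.commit()

# "solve" process: an entirely separate connection sees the same row.
worker = sqlite3.connect(path)
row = worker.execute("SELECT id, payload FROM task WHERE done = 0").fetchone()
print(row)  # (1, 'solve this')
```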

You will need your worker process (“solve”) to periodically poll the DB looking for tasks to do, or send it a message (hint: celery!). Your ui client (“interface”) can just check the DB when the user refreshes.
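A bare-bones polling loop for the worker might look like the following; `Task`, its `status` field, and `solve()` are hypothetical stand-ins for whatever shared model and computation you actually have:

```python
import time

from myapp.models import Task  # hypothetical shared model


def worker_loop(poll_interval=5):
    """Poll the shared table for pending work; runs until killed."""
    while True:
        for task in Task.objects.filter(status='pending'):
            task.status = 'in_progress'
            task.save()
            task.result = solve(task.payload)  # your long-running computation
            task.status = 'done'
            task.save()
        time.sleep(poll_interval)
```

This is exactly the sort of loop celery would replace with a proper message-driven worker.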

From an implementation point of view, it’s probably simplest to share the code completely (all models, views, etc.) across both projects. You’d have one process start the UI web server in the normal way, and for the worker, hooking up a custom management command is probably the simplest way to kickstart your worker’s loop.
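Wiring the loop up as a custom management command is just a matter of dropping a file into your app’s `management/commands/` directory; the module path and command name below are illustrative:

```python
# myapp/management/commands/runworker.py
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Start the background worker that polls the shared task table."

    def handle(self, *args, **options):
        # Hypothetical module holding the polling loop sketched above.
        from myapp.worker import worker_loop
        worker_loop()
```

You’d then start the worker with `python manage.py runworker`, while the UI process runs `runserver` (or your WSGI server) as usual.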

You may run into DB locking / race-condition issues if you don’t use select_for_update when you intend to write to a row. Alternatively, you could use .save(update_fields=zzz) to avoid contention on columns you don’t touch, but that’s only available from Django 1.5.
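A sketch of both approaches; `Task` is again a hypothetical shared model, and note that `select_for_update()` only takes effect inside a transaction:

```python
from django.db import transaction

from myapp.models import Task  # hypothetical shared model

# Option 1: lock the row so the worker and the UI can't clobber each other.
# (On Django < 1.6, use transaction.commit_on_success() instead of atomic().)
with transaction.atomic():
    task = Task.objects.select_for_update().get(pk=task_id)
    task.status = 'done'
    task.save()

# Option 2 (Django 1.5+): write only the columns this process owns,
# leaving the other process's columns untouched.
task.save(update_fields=['status', 'result'])
```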

👤EB.
