1👍
It’s never a good idea to start threads from views like that.
The most common way to solve this is to delegate the work to a separate worker process: in addition to Django, you have another Python process looking for work.
You can keep it simple and have the Django view store information about the work that needs to be done in some format on disk or in a database (this is the work queue). A worker process then runs in a loop, checking for available work every N seconds. Increase the number of worker processes to add more computing power (limited by your hardware, of course).
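A minimal sketch of that idea is below. The Job model, its field names, and the project layout ("myproject", "myapp") are illustrative assumptions, not something from the original question.

# models.py -- a minimal database-backed work queue
from django.db import models

class Job(models.Model):
    PENDING, RUNNING, DONE = "pending", "running", "done"

    payload = models.TextField()           # description of the work to do
    status = models.CharField(max_length=10, default=PENDING)
    result = models.TextField(blank=True)  # filled in by the worker
    created = models.DateTimeField(auto_now_add=True)

# worker.py -- a stand-alone polling worker, started separately from Django
import os
import time

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # needed to use the ORM outside of manage.py

from myapp.models import Job

POLL_INTERVAL = 5  # check for available work every N seconds

def run_job(job):
    # the actual heavy computation goes here; return a string to store as the result
    ...

while True:
    job = Job.objects.filter(status=Job.PENDING).order_by("created").first()
    if job is None:
        time.sleep(POLL_INTERVAL)
        continue
    job.status = Job.RUNNING
    job.save(update_fields=["status"])
    job.result = run_job(job) or ""
    job.status = Job.DONE
    job.save(update_fields=["status", "result"])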
The HTTP request that creates the work can return a job_id that the user can query to get the status of the job: is it pending, in progress, or done? The user could also fetch the result of the job, or metadata such as its duration and logs.
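Building on the hypothetical Job model above, the two endpoints could look roughly like this (URL wiring omitted):

# views.py -- sketch of the submit/status endpoints
from django.http import JsonResponse
from django.shortcuts import get_object_or_404

from myapp.models import Job

def submit(request):
    # store the work request; the worker will pick it up later
    job = Job.objects.create(payload=request.POST.get("payload", ""))
    return JsonResponse({"job_id": job.id, "status": job.status})

def job_status(request, job_id):
    job = get_object_or_404(Job, pk=job_id)
    data = {"job_id": job.id, "status": job.status}
    if job.status == Job.DONE:
        data["result"] = job.result
    return JsonResponse(data)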
There are also frameworks that solve problems like this, such as Celery and django-channels. Celery is probably easier to start with, but it might be overkill for what you are trying to do.
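For comparison, a Celery task for the same kind of job can be as small as the sketch below. It assumes Celery is installed and a broker (e.g. Redis or RabbitMQ) is configured in the project; the module and task names are illustrative.

# tasks.py -- minimal Celery sketch
from celery import shared_task

@shared_task
def run_compute(job_id):
    # load the job, do the heavy work, store the result
    ...

# In the view, instead of doing the work inline:
#     run_compute.delay(job.id)   # queued immediately; a Celery worker picks it up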
The advantage of using workers like this is that you can have a very lightweight REST API in front, and you can scale up the number of workers, possibly across multiple servers, as demand increases.
1👍
I finally solved this problem by using Python's subprocess module. I put my “compute” code in a separate file and use Popen to call it:
from subprocess import Popen

from django.http import HttpResponse


def submit(request):
    # some preparation
    ........
    # call the engine in a separate process
    p = Popen(["python", "compute.py", <arguments>])
    return HttpResponse("started")
However, as suggested, subprocess is not a very good or safe practice. This solution is easy to implement, but I will try to convert the back-end to worker mode using Celery or django-channels, as @Grimmy suggested.