[Django]-Progress bar with long web requests

4👍

Keep in mind that showing a progress bar may not be a good idea: long-running requests can time out, and many simultaneous requests can put your server under heavy load.

Put the zipping task in a queue and have it call back to notify the user somehow – by e-mail, for instance – that the process has finished.

Take a look at django-lineup.

Your code will look pretty much like:

from lineup import registry

def create_archive(queue_id, queue):
    # Zip the files and store the download link on the queued job.
    queue.set_param("zip_link", _create_archive(resource=queue.context_object, user=queue.user))
    return queue


def create_archive_callback(queue_id, queue):
    # Runs once the job finishes: e-mail the link to the user.
    _send_email_notification(subject=queue.get_param("zip_link"), user=queue.user)
    return queue

registry.register_job('create_archive', create_archive, callback=create_archive_callback)

In your views, create queued tasks by:

    from lineup.factory import JobFactory

    j = JobFactory()
    j.create_job('create_archive', request.user, your_resource_object_containing_files_to_zip, {'extra_param': 'value'})

Then run your queue processor (probably inside of a screen session):

./manage.py run_queue

Oh, and on the subject, you might also be interested in estimating zip file creation time; I got pretty slick answers there.

👤ohnoes

2👍

Fun fact: You might be able to use a progress bar to trick users into thinking that things are going faster than they really are.

http://www.chrisharrison.net/projects/progressbars/index.html

👤Tyler

0👍

You could use a ‘log file’ to keep track of the zipped files and of how many files still remain.

The procedure would be:

  1. Count the number of files and write it to a text file, in a format like totalfiles.filesprocessed
  2. Each time you zip a file, update the log

So, if you have to zip 3 files, the log file will grow as:

3.0 -> start, no files processed yet
3.1 -> 1 file of 3 processed, 33% of the task complete
3.2 -> 2 files of 3 processed, 66% of the task complete
3.3 -> 3 files of 3 processed, 100% of the task complete

Then a simple AJAX function (on an interval) can check the log file every second.

In Python, opening, reading and writing such a small file should be very fast, but it could cause trouble if many users do it at the same time. You will obviously need to create a separate log file for each request, perhaps with a random name, and delete it once the task is complete.

One problem is that, for the AJAX call to read the log file, you will need to open and close the file handle in Python every time you update it.

Eventually, for a more accurate progress meter, you could even use file sizes instead of the number of files as the metric.

👤Strae

-1👍

Better than a static page, show a JavaScript dialog (using Shadowbox, jQuery UI or some custom method) with a throbber (you can get one at http://www.ajaxload.info/). You can also show the throbber in your page, without a dialog. Most users only want to know their action is being handled, and can live without precise progress information (“Please wait, this could take some time…”).

jQuery UI also has a progress bar API. You could make periodic AJAX queries to a dedicated page on your website to get a progress report and update the progress bar accordingly. Depending on how often the archiving is run, how many users can trigger it and how you authenticate your users, this could be quite hard.
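A minimal sketch of what that dedicated page could return, assuming the server tracks a done/total count somewhere (the helper name and JSON fields are hypothetical; a Django view would return this string with an application/json content type, and the client-side interval would feed the percentage to the jQuery UI progress bar):

```python
import json

def progress_payload(done, total):
    """Build the JSON body the polling AJAX call would receive."""
    # Guard against division by zero before the task has counted its files.
    percent = int(done * 100 / total) if total else 0
    return json.dumps({"done": done, "total": total, "percent": percent})
```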

👤Xr.

Leave a comment