[Answer]-File upload and store, then processing with remote worker

1👍

You say you’re using IronWorker, so just include the models each worker needs in its .worker file. For example, say you have a worker called report_worker.py that needs “Report” and “User”; point to each of those in the report_worker.worker file:

file '../models/user.py'
file '../models/report.py'

Or since you’re using Django, you might have all your models in models.py, so:

file 'models.py'

Then, whenever either of those files changes, just re-upload the worker with the CLI:

iron_worker upload report_worker

Then your workers can use the same models as your app. Hope that helps!
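
For completeness, here is a rough sketch of what report_worker.py itself might look like. It assumes the bundled models.py can be imported from next to the script and that a Django settings module is available to the worker; the settings module name and the processing step are made up for illustration:

import os
import django

# Point the worker at a settings module before touching the ORM
# (the module name "settings" here is hypothetical).
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings")
django.setup()  # needed on Django 1.7+ before the models can be used

from models import Report, User  # the files bundled via report_worker.worker

def run():
    # Hypothetical processing step: handle any reports not yet processed.
    for report in Report.objects.filter(processed=False):
        report.processed = True
        report.save()

if __name__ == "__main__":
    run()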

More info on .worker files here: http://dev.iron.io/worker/reference/dotworker/

👤Travis Reeder

0👍

Do you really need the exact same models in your workers? You can give the worker its own model to perform its own actions on your data. Just design APIs for your data and access them separately from your main site.
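
A rough sketch of that approach, assuming the main site exposes a small REST endpoint for reports (the URL, token, and fields below are made up) and that the worker has the requests library available:

import requests

API_BASE = "https://example.com/api"   # hypothetical endpoint on the main site
API_TOKEN = "change-me"                # hypothetical auth token

def process_pending_reports():
    headers = {"Authorization": "Token %s" % API_TOKEN}

    # Pull the data the worker needs instead of querying the database directly.
    resp = requests.get("%s/reports/?status=pending" % API_BASE, headers=headers)
    resp.raise_for_status()

    for report in resp.json():
        # ... do the heavy processing here ...
        # then push the result back through the same API.
        requests.patch("%s/reports/%s/" % (API_BASE, report["id"]),
                       json={"status": "done"}, headers=headers)

if __name__ == "__main__":
    process_pending_reports()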

If it’s really necessary, a Django app can be shared across multiple projects. So you can put the generic code (like your shared models) in a separate app and keep it in source control. After an update to your main website, you can easily update the workers as well.
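
As a sketch, both the website project and the worker project would then list the shared app in their settings; the app name below is hypothetical and the app would be installed from its own repository (e.g. via pip):

# settings.py in both the website project and the worker project.
INSTALLED_APPS = (
    "django.contrib.contenttypes",
    "django.contrib.auth",
    "shared_models",  # reusable app holding Report, the profile models, etc.
)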

👤tuvokki

0👍

There are a few interesting options.

For example, you can add a worker re-upload step to your deploy process. That will guarantee consistency between the deployed application and the workers.
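
As a sketch, that step could be a small script run at the end of the deploy, assuming the iron_worker CLI is installed on the machine doing the deploy (the script name and worker list are made up):

import subprocess

WORKERS = ["report_worker"]  # every .worker package the project uses

def reupload_workers():
    for name in WORKERS:
        # Same command you would run by hand after changing the models.
        subprocess.check_call(["iron_worker", "upload", name])

if __name__ == "__main__":
    reupload_workers()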

Using your own (REST) API is a great idea; I like it even more than sharing models between different applications.
