This is a Python and OS issue, not really a Django or Celery issue. Without getting too deep:
1) A process will never release memory address space back to the OS once it has requested it. It never says “hey, I’m done here, you can have it back”. In the example you’ve given, I’d expect the process size to grow for a while and then stabilize, possibly at a high baseline. After your example allocation, you might call the gc interface to force a garbage collection and see how much of that space becomes reusable within the process.
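As a minimal sketch of that gc interface (the allocation here is just an illustration, not your actual workload):

```python
import gc

# Allocate a large structure, then drop the only reference to it.
data = [object() for _ in range(10**6)]
del data

# Force a full collection. The return value is the number of
# unreachable objects the collector found (reference cycles that
# plain reference counting alone could not free).
unreachable = gc.collect()
print(unreachable)
```

Even after this, the process's address space typically does not shrink; the freed memory is simply available for the process to reuse.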
2) This isn’t usually a problem, because the OS pages out unused pages once your process stops accessing the address space it has deallocated.
3) It is a problem if your process is leaking object references, which prevents Python from garbage collecting and reusing that space within the process, and forces the process to keep asking the OS for more address space. At some point, the OS cries uncle and will (probably) kill your process with its OOM killer or a similar mechanism.
4) If you are leaking, either fix the leak or set CELERYD_MAX_TASKS_PER_CHILD, and your child processes will (probably) commit suicide before upsetting the OS.
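For example, in your Django settings (CELERYD_MAX_TASKS_PER_CHILD is the older setting name; recent Celery versions spell it worker_max_tasks_per_child):

```python
# settings.py
# Recycle each worker child process after it has executed 100 tasks,
# releasing whatever address space that child had accumulated.
# The value 100 is illustrative; tune it to your task size and leak rate.
CELERYD_MAX_TASKS_PER_CHILD = 100
```

The pool replaces each retired child with a fresh process, so throughput continues while memory is capped per child.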
This is a good general discussion on Python’s memory management:
CPython memory allocation
And a few minor things: use xrange, not range – range will generate all the values and then iterate over that list, while xrange is just a generator. Also, have you set Django DEBUG=False? With DEBUG=True, Django keeps a record of every SQL query in memory, which grows without bound in a long-running process.
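The difference is easy to see by measuring object sizes. (Note: this advice targets Python 2; in Python 3, range already behaves like xrange, which is what the sketch below relies on.)

```python
import sys

# Python 3's range is a lazy, constant-size sequence object that
# produces values on demand -- the equivalent of Python 2's xrange.
lazy = range(10**6)

# Materializing it into a list is what Python 2's range did:
# every value is allocated up front.
eager = list(lazy)

print(sys.getsizeof(lazy))   # small, independent of the length
print(sys.getsizeof(eager))  # much larger: grows with the length
```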