Why doesn't Python release the memory when I delete a large object?
If you create a large object and delete it again, Python has probably released the memory, but the memory allocators involved don’t necessarily return that memory to the operating system, so the Python process may appear to use far more virtual memory than it actually needs.
Memory allocation works at several levels in Python. There’s the system’s own allocator, which is what shows up when you check memory use with the Windows Task Manager or ps. Then there’s the C runtime’s memory allocator (malloc), which gets memory from the system allocator and hands it out in smaller chunks to the application. Finally, there’s Python’s own object allocator, which is used for objects up to 256 bytes in size. This allocator grabs large chunks of memory from the C allocator and chops them up into smaller pieces using an algorithm carefully tuned for Python.
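As a rough sketch of where the small-object threshold falls, sys.getsizeof reports an object’s size. Note one assumption: the 512-byte cutoff used below is the one in modern CPython 3; the versions this FAQ describes used 256 bytes.

```python
import sys

# Python's small-object allocator (pymalloc) serves requests up to a
# fixed threshold: 256 bytes in the CPython versions discussed here,
# 512 bytes in modern CPython 3.  Larger requests go to the C
# allocator (malloc) directly.
THRESHOLD = 512  # modern CPython 3 value; older versions used 256

for obj in (0, 3.14, "hello", (1, 2, 3), list(range(100))):
    size = sys.getsizeof(obj)
    where = "pymalloc" if size <= THRESHOLD else "C malloc"
    print(f"{type(obj).__name__:6} {size:4} bytes -> {where}")
```

A 100-element list already exceeds the threshold, so its backing array comes straight from the C allocator rather than from pymalloc.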
Exactly when (and whether) Python’s allocator returns memory to the C runtime, and when the C runtime returns memory to the operating system, depends on many parameters, including the Python and library versions, your application’s object allocation patterns, and so on. For example, CPython 2.5 and later returns memory used for smaller objects to the C allocator when possible, while earlier versions never did. However, it’s important to remember that memory returned to a given allocator will be reused by that allocator, even if it’s never returned to the system.
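One way to observe this from inside the interpreter is sys.getallocatedblocks(), available since CPython 3.4 (so not in the 2.x versions discussed above). It counts the blocks currently held by Python’s memory manager, and that count drops after a del even when the OS-level process size does not:

```python
import sys

before = sys.getallocatedblocks()

# Allocate roughly 100,000 separate string objects.
data = [str(i) for i in range(100_000)]
during = sys.getallocatedblocks()

# Deleting the list frees the blocks back to Python's allocator...
del data
after = sys.getallocatedblocks()

# ...but the memory may stay with the process, ready for reuse.
print(during - before)  # roughly 100,000 new blocks
print(after < during)   # True: blocks were released internally
```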
Another possible cause for excessive memory usage is that Python uses so-called “free lists” for certain object types, including integers and floats. Tim Peters writes:
If I do
>>> L = range(50*1024*100)
>>> del L
Python is still using more than 60 MB. Why isn’t the memory released?
The problem is that you’ve created 5 million integers that are simultaneously alive, and each int object consumes 12 bytes. “For speed”, Python maintains an internal free list for integer objects. Unfortunately, that free list is both immortal and unbounded in size. Floats also use an immortal and unbounded free list.
[…] Do you really need a list containing 5 million integers? I never do ;-) Something like
for i in xrange(50*1024*100):   # note the "x" in "xrange"
    whatever
consumes a trivial amount of memory, because only two integers in the range are simultaneously alive at any point, and the free list makes reusing their space fast.
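The same advice carries over to Python 3, where range() is already lazy like 2.x’s xrange(), and list(range(...)) is the modern spelling of the eager version:

```python
# range() in Python 3 yields one integer at a time, so only a couple
# of int objects are alive at any moment during the loop.
total = 0
for i in range(50 * 1024 * 100):
    total += i

# Compare: list(range(50 * 1024 * 100)) would keep all 5,120,000
# int objects alive at once, just like Python 2's range().
print(total)
```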