Here's a fuller interactive session that will help me explain what's going on (Python 2.6 on 32-bit Windows XP, though the exact platform doesn't really matter):

>>> import sys
>>> sys.getsizeof([])
36
>>> sys.getsizeof([1])
40
>>> lst = []
>>> lst.append(1)
>>> sys.getsizeof(lst)
52
>>> 

Note that the empty list is a bit smaller than the one with [1] in it. When an element is appended to the empty list, however, it grows much larger than the 40 bytes of [1].

The reason for this lies in the implementation details of Objects/listobject.c in the CPython source.

Empty list

When an empty list [] is created, no space for elements is allocated - this can be seen in PyList_New. 36 bytes is the amount of space required for the list data structure itself on a 32-bit machine.

List with one element

When a list with a single element [1] is created, space for one element is allocated in addition to the memory required by the list data structure itself. Again, this can be found in PyList_New. Given size as argument, it computes:

nbytes = size * sizeof(PyObject *);

And then has:

if (size <= 0)
    op->ob_item = NULL;
else {
    op->ob_item = (PyObject **) PyMem_MALLOC(nbytes);
    if (op->ob_item == NULL) {
        Py_DECREF(op);
        return PyErr_NoMemory();
    }
    memset(op->ob_item, 0, nbytes);
}
Py_SIZE(op) = size;
op->allocated = size;

So we see that with size = 1, space for one pointer is allocated: 4 bytes, on my 32-bit box.
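As a quick aside (not part of the original answer), the pointer width of your own build can be checked with ctypes:

```python
import ctypes

# sizeof(PyObject *) is the platform pointer width: 4 bytes on the
# 32-bit build used in the session above, 8 bytes on a 64-bit build.
pointer_size = ctypes.sizeof(ctypes.c_void_p)
print(pointer_size)
```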

Appending to an empty list

When calling append on an empty list, here's what happens:

  • PyList_Append calls app1
  • app1 asks for the list's size (and gets 0 as an answer)
  • app1 then calls list_resize with size+1 (1 in our case)
  • list_resize has an interesting allocation strategy, summarized in this comment from its source.

Here it is:

/* This over-allocates proportional to the list size, making room
* for additional growth.  The over-allocation is mild, but is
* enough to give linear-time amortized behavior over a long
* sequence of appends() in the presence of a poorly-performing
* system realloc().
* The growth pattern is:  0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
*/
new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6);

/* check for integer overflow */
if (new_allocated > PY_SIZE_MAX - newsize) {
    PyErr_NoMemory();
    return -1;
} else {
    new_allocated += newsize;
}
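The growth pattern quoted in the comment can be reproduced by translating the formula into Python (a sketch of the 2.x formula shown above; later CPython versions tweak the constants):

```python
def list_resize_allocated(newsize):
    # mirrors: new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6),
    # then new_allocated += newsize
    return (newsize >> 3) + (3 if newsize < 9 else 6) + newsize

# simulate repeated appends, recording the capacity at each reallocation
pattern, allocated = [], 0
for n in range(1, 89):
    if n > allocated:        # out of spare slots: list_resize is called
        allocated = list_resize_allocated(n)
        pattern.append(allocated)

print(pattern)   # [4, 8, 16, 25, 35, 46, 58, 72, 88]
```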

Let's do some math

Let's see how the numbers quoted in the session at the beginning of this answer are reached.

So 36 bytes is the size required by the list data structure itself on 32-bit. With a single element, space is allocated for one pointer, so that's 4 extra bytes - total 40 bytes. OK so far.

When app1 is called on an empty list, it calls list_resize with size=1. According to the over-allocation algorithm of list_resize, the next allocation size after 1 is 4, so space for 4 pointers will be allocated. 4 * 4 = 16 bytes, and 36 + 16 = 52.
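The same arithmetic can be checked on any build. The absolute numbers differ (a 64-bit CPython 3.x has a larger list header and 8-byte pointers), but assuming the resize formula above still takes the first append to 4 slots, the pattern is identical:

```python
import sys

base = sys.getsizeof([])    # the list structure itself
one = sys.getsizeof([1])    # structure plus exactly one pointer slot
ptr = one - base            # effectively sizeof(PyObject *)

lst = []
lst.append(1)               # calls list_resize(1), which allocates 4 slots
grown = sys.getsizeof(lst)

print(base, one, grown)
print((grown - base) // ptr)   # slots allocated after the first append: 4
```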

Indeed, everything makes sense :-)

Answer from Eli Bendersky on Stack Overflow
Top answer (1 of 4, score 176)

Answer 2 of 4 (score 11)

You're looking at how lists are allocated (though perhaps you just wanted to see how big things are; in that case, use sys.getsizeof()).

When something is added to a list, one of two things can happen:

  1. The extra item fits in spare space.

  2. Extra space is needed, so a new, larger list is made, the contents are copied across, and the extra item is added.

Since (2) is expensive (copying things, even pointers, takes time proportional to the number of things to be copied, and so grows as lists get large), we want to do it infrequently. So instead of just adding a little more space, we add a whole chunk. Typically the amount added is proportional to what is already in use; that way the maths works out so that the average cost of allocating memory, spread out over many appends, is only proportional to the list size.
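The two cases above can be sketched as a toy dynamic array. This is illustrative only: it uses plain doubling, while CPython's actual growth factor (shown in Eli's answer) is milder.

```python
class DynArray:
    """Toy dynamic array illustrating 'fits in spare space' vs
    'grow a new chunk and copy' (not CPython's real implementation)."""

    def __init__(self):
        self._items = []     # backing store; its length is the capacity
        self._n = 0          # number of slots actually in use

    def append(self, x):
        if self._n == len(self._items):
            # case 2: no spare space -- allocate a bigger chunk and copy
            new_cap = max(4, 2 * len(self._items))
            new_items = [None] * new_cap
            new_items[:self._n] = self._items[:self._n]
            self._items = new_items
        # case 1: the item now fits in spare space
        self._items[self._n] = x
        self._n += 1

    def __len__(self):
        return self._n

d = DynArray()
for i in range(10):
    d.append(i)
print(len(d), len(d._items))   # 10 items stored, capacity 16
```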

So what you are seeing is related to this behaviour. I don't know the exact details, but I wouldn't be surprised if [] or [1] (or both) are special cases, where only enough memory is allocated (to save memory in these common cases), and then appending does the "grab a new chunk" described above that adds more.

But I don't know the exact details; this is just how dynamic arrays work in general. The exact implementation of lists in Python will be finely tuned so that it is optimal for typical Python programs. So all I am really saying is that you can't trust the size of a list to tell you exactly how much it contains: it may contain extra space, and the amount of extra free space is difficult to judge or predict.

A neat alternative to this is to build lists as (value, pointer) pairs, where each pointer points to the next pair. In this way you can grow lists incrementally, although the total memory used is higher. That is a linked list (what Python uses is more like a vector or a dynamic array).
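A minimal sketch of those (value, pointer) pairs, just to make the contrast concrete (the helper names are hypothetical):

```python
# Each node is a (value, next) pair; None marks the end of the list.
nil = None

def cons(value, rest):
    return (value, rest)

def to_pylist(node):
    out = []
    while node is not None:
        value, node = node
        out.append(value)
    return out

lst = cons(1, cons(2, cons(3, nil)))
print(to_pylist(lst))   # [1, 2, 3]
```

Prepending to such a list is O(1) and never copies, but indexing and traversal are O(n), and each element pays for a whole tuple of overhead.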

Eli's excellent answer explains that both [] and [1] are allocated exactly, but that appending to [] allocates an extra chunk. The comment in the code is what I am saying above (this is called "over-allocation", and the amount is proportional to what we have, so that the average ("amortised") cost is proportional to the size).

Answers from "How many bytes per element are there in a Python list (tuple)?" on Stack Overflow
Top answer (1 of 5, score 26)

"It depends." Python allocates space for lists in such a way as to achieve amortized constant time for appending elements to the list.

In practice, what this means with the current implementation is that a list grown by appending always has space allocated for somewhat more elements than it holds, following the 0, 4, 8, 16, 25, 35, 46, ... growth pattern. So building a list of a million elements by repeated appends will allocate room for slightly more than a million.

This is only the space required to store the list structure itself (which is an array of pointers to the Python objects for each element). A 32-bit system will require 4 bytes per element, a 64-bit system will use 8 bytes per element.

Furthermore, you need space to store the actual elements. This varies widely. For small integers (-5 to 256 currently), no additional space is needed, but for larger numbers Python allocates a new object for each integer, which takes 10-100 bytes and tends to fragment memory.
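Both effects are easy to observe; note that the -5..256 cache is a CPython implementation detail, not a language guarantee:

```python
import sys

# every list element is a pointer to a full Python object; even a
# small int object costs a few dozen bytes on its own
print(sys.getsizeof(1), sys.getsizeof(10**100))

a, b = int('256'), int('256')
print(a is b)    # True on CPython: small ints are cached singletons
```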

Bottom line: it's complicated and Python lists are not a good way to store large homogeneous data structures. For that, use the array module or, if you need to do vectorized math, use NumPy.
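The density difference is visible even with a shallow getsizeof, which counts the list's pointer array but not the boxed int objects it points to ('i' is assumed to be a 4-byte C int here):

```python
import sys
from array import array

n = 1000
as_list = list(range(n))
as_array = array('i', range(n))   # packed machine ints, no per-element objects

print(sys.getsizeof(as_list))     # list header + n pointers
print(sys.getsizeof(as_array))    # array header + roughly n * itemsize bytes
```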

PS- Tuples, unlike lists, are not designed to have elements progressively appended to them. I don't know how their allocator works, but don't even think about growing a large data structure that way :-)

Answer 2 of 5 (score 15)

Useful links:

How to get memory size/usage of python object

Memory sizes of python objects?

if you put data into dictionary, how do we calculate the data size?

However they don't give a definitive answer. The way to go:

  1. Measure memory consumed by Python interpreter with/without the list (use OS tools).

  2. Use a third-party extension module which defines some sort of sizeof(PyObject).

Update:

Recipe 546530: Size of Python objects (revised)

import asizeof

N = 1000000
print asizeof.asizeof(range(N)) / N
# -> 20 (python 2.5, WinXP, 32-bit Linux)
# -> 33 (64-bit Linux)
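In the same spirit as that recipe, here is a minimal recursive deep-sizeof sketch in modern Python (a toy under stated assumptions; real tools such as pympler's asizeof handle slots, shared buffers, and many more container types):

```python
import sys

def deep_sizeof(obj, seen=None):
    """Sum sys.getsizeof over an object graph, visiting each object once."""
    seen = set() if seen is None else seen
    if id(obj) in seen:          # already counted (also breaks cycles)
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_sizeof(k, seen) + deep_sizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_sizeof(item, seen) for item in obj)
    return size

print(deep_sizeof([1, [2, 3]]))   # shallow list size plus all children
```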