The memory assigned is not disproportional; you are creating 100,000 objects! As you can see, they take up roughly 34 megabytes of space:

>>> sys.getsizeof(Test())+sys.getsizeof(Test().__dict__)
344
>>> (sys.getsizeof(Test())+sys.getsizeof(Test().__dict__)) * 100000 / 10**6
34.4 #megabytes

You can get a minor improvement with __slots__, but you will still need about 20MB of memory to store those 100,000 objects.

>>> sys.getsizeof(Test2())+sys.getsizeof(Test2().__slots__)
200
>>> (sys.getsizeof(Test2())+sys.getsizeof(Test2().__slots__)) * 100000 / 10**6
20.0 #megabytes

(As mensi's answer notes, sys.getsizeof does not take referenced objects into account. You can tab-autocomplete on an object to see most of its attributes.)

See the SO answer "Usage of __slots__?" and the language reference: http://docs.python.org/release/2.5.2/ref/slots.html

To use __slots__:

class Test2(object):  # must be a new-style class; __slots__ has no effect on old-style classes in Python 2
    __slots__ = ['a','b','c','d','e']

    def __init__(self):
        ...
Answer from ninjagecko on Stack Overflow
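The effect is easy to check yourself in Python 3 (a sketch; the class names Plain and Slotted are mine, and exact byte counts vary by CPython version and platform, so only the relative comparison matters):

```python
import sys

class Plain:
    def __init__(self):
        self.a = self.b = self.c = self.d = self.e = 1

class Slotted:
    __slots__ = ['a', 'b', 'c', 'd', 'e']
    def __init__(self):
        self.a = self.b = self.c = self.d = self.e = 1

p, s = Plain(), Slotted()
# a Plain instance pays for the object header plus a separate __dict__;
# a Slotted instance stores its five attribute slots inline
plain_total = sys.getsizeof(p) + sys.getsizeof(p.__dict__)
slot_total = sys.getsizeof(s)
print(plain_total, slot_total)  # slot_total is noticeably smaller
```

Multiplied by 100,000 instances, that per-object difference is where the memory savings come from.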
r/learnpython: Size of python objects different? [Real memory vs sys.getsizeof()] (November 11, 2016)

Hi Pyople!

Yesterday I learned about the sys.getsizeof() function and tried some code. More specifically:

lst = [i for i in range(1000000000)]  # one billion numbers; takes about a minute to create

When I use sys.getsizeof(lst), it returns 8058558880, which is correct. But when I look at my system resources in Linux CentOS 7 IPython (Python 3.4) I see: ipython Memory: 39592564 K, Shared Mem: 5176 K. That's freaking 40 GB.

I don't understand why an object that is 8 GB in size takes 40 GB of system memory. I tried it with a list of around 400 MB and the system took roughly 400 MB * 5 = 2 GB.

Why is it taking 5 times more memory than it should? Or is the problem only because I tried it in IPython / Konsole? And in a program it wouldn't be a problem?

Top answer (4 points):

If you check the size of a list, it will provide the size of the list data structure itself, including the pointers to its constituent elements; it won't include the size of the elements themselves.

str1_size = sys.getsizeof(['a' for i in xrange(0, 1024)])
str2_size = sys.getsizeof(['abc' for i in xrange(0, 1024)])
int_size = sys.getsizeof([123 for i in xrange(0, 1024)])
none_size = sys.getsizeof([None for i in xrange(0, 1024)])
str1_size == str2_size == int_size == none_size  # True: only the pointers are counted

The size of an empty list: sys.getsizeof([]) == 72
Add an element: sys.getsizeof([1]) == 80
Add another element: sys.getsizeof([1, 1]) == 88
So each element adds 8 bytes.
To get 1024 bytes, we need (1024 - 72) / 8 = 119 elements.

The size of the list with 119 elements: sys.getsizeof([None for i in xrange(0, 119)]) == 1080.
This is because a list maintains an extra buffer for inserting more items, so that it doesn't have to resize every time. (The size comes out to be same as 1080 for number of elements between 107 and 126).
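The over-allocation buffer can be observed directly (a sketch; the exact growth pattern is a CPython implementation detail that differs between versions, so only the plateaus are worth relying on):

```python
import sys

lst = []
sizes = []
for _ in range(64):
    lst.append(None)
    sizes.append(sys.getsizeof(lst))

# getsizeof stays flat between resizes: the list grows its pointer array
# in chunks, so many appends cost no additional allocated bytes
print(sorted(set(sizes)))
```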

So what we need is an immutable data structure that doesn't keep this buffer: a tuple.

empty_tuple_size = sys.getsizeof(())                     # 56
single_element_size = sys.getsizeof((1,))                # 64
pointer_size = single_element_size - empty_tuple_size    # 8
n_1kb = (1024 - empty_tuple_size) // pointer_size        # (1024 - 56) // 8 = 121
tuple_1kb = (1,) * n_1kb
sys.getsizeof(tuple_1kb) == 1024

So this is your answer to get a 1024-byte (1 KB) data structure: (1,) * 121

But note that this is only the size of the tuple and its constituent pointers. For the total size, you actually need to add up the sizes of the individual elements.


Alternate:

sys.getsizeof('') == 37
sys.getsizeof('1') == 38     # each character adds 1 byte

For 1024 bytes, we need 987 characters:

sys.getsizeof('1'*987) == 1024

And this is the actual size, not just the size of pointers.

Stack Overflow: What is the difference between len() and sys.getsizeof() methods in Python?

Top answer (85 votes):

They are not the same thing at all.

len() queries for the number of items contained in a container. For a string that's the number of characters:

Return the length (the number of items) of an object. The argument may be a sequence (string, tuple or list) or a mapping (dictionary).

sys.getsizeof() on the other hand returns the memory size of the object:

Return the size of an object in bytes. The object can be any type of object. All built-in objects will return correct results, but this does not have to hold true for third-party extensions as it is implementation specific.

Python string objects are not simple sequences of characters, 1 byte per character.

Specifically, the sys.getsizeof() function includes the garbage collector overhead if any:

getsizeof() calls the object’s __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector.

String objects do not need to be tracked (they cannot create circular references), but they do need more memory than just the bytes per character. In Python 2, the __sizeof__ method returns (in C code):

Py_ssize_t res;
res = PyStringObject_SIZE + PyString_GET_SIZE(v) * Py_TYPE(v)->tp_itemsize;
return PyInt_FromSsize_t(res);

where PyStringObject_SIZE is the C struct header size for the type, PyString_GET_SIZE basically is the same as len() and Py_TYPE(v)->tp_itemsize is the per-character size. In Python 2.7, for byte strings, the size per character is 1, but it's PyStringObject_SIZE that is confusing you; on my Mac that size is 37 bytes:

>>> sys.getsizeof('')
37

For unicode strings the per-character size goes up to 2 or 4 (depending on compilation options). On Python 3.3 and newer, Unicode strings take up between 1 and 4 bytes per character, depending on the contents of the string.
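The per-character cost in Python 3 can be checked directly (a sketch of PEP 393's flexible string representation; the helper name per_char is mine, and the fixed header size varies, so only the per-character increments are meaningful):

```python
import sys

def per_char(ch):
    # marginal cost of one more character, with the fixed header cancelled out
    return sys.getsizeof(ch * 101) - sys.getsizeof(ch * 100)

print(per_char("a"))           # ASCII/Latin-1 text: 1 byte per character
print(per_char("\u2603"))      # BMP text (e.g. SNOWMAN): 2 bytes per character
print(per_char("\U0001F600"))  # astral text (e.g. emoji): 4 bytes per character
```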

For containers such as dictionaries or lists that reference other objects, the memory size given covers only the memory used by the container and the pointer values used to reference those other objects. There is no straightforward method of including the memory size of the ‘contained’ objects because those same objects could have many more references elsewhere and are not necessarily owned by a single container.

The documentation states it like this:

Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to.

If you need to calculate the memory footprint of a container and anything referenced by that container you’ll have to use some method of traversing to those contained objects and get their size; the documentation points to a recursive recipe.
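A minimal version of that recursive traversal might look like this (a sketch only: it handles a few common containers rather than arbitrary objects, and the name total_size is my own, not from the documentation's recipe):

```python
import sys

def total_size(obj, seen=None):
    """Sum sys.getsizeof over an object graph, counting each object once."""
    if seen is None:
        seen = set()
    if id(obj) in seen:   # already counted; also guards against cycles
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

data = {"xs": [1, 2, 3], "s": "hello"}
print(total_size(data))  # larger than sys.getsizeof(data) alone
```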

Second answer (2 votes):

The key difference is that len() gives the actual number of elements in a container, whereas sys.getsizeof() gives the memory size that the object itself occupies.

For more information, read the Python docs, available at https://docs.python.org/3/library/sys.html#module-sys

r/learnpython: Why does a 9 GB list appear to use 40 GB of memory? (October 9, 2021)

Can someone help me understand what's going on here? My OS reports that a python process whose only large object is a 9 GB list is consuming 40.6 GB of system memory. I repeated this test several times with both the interactive and standard interpreters and the results are pretty consistent.

import sys
import psutil

#memory in use prior to generating list
prior_used = psutil.virtual_memory().used

print(f"Prior used: {round(prior_used/1e9, 2)} GB")
z = [*range(1000000000)]
list_size = sys.getsizeof(z)
print(f"List size: {round(list_size/1e9, 2)} GB")

#memory in use after to generating list
post_used = psutil.virtual_memory().used
print(f"Post used: {round(post_used/1e9, 2)} GB")

difference = post_used - prior_used
print(f"Memory used by list: {round(difference/1e9, 2)} GB")

#clear the list
z = None
after_deleting = psutil.virtual_memory().used
print(f"Memory used after clearing the list: {round(after_deleting/1e9, 2)} GB")

# output:
# Prior used: 3.27 GB
# List size: 9.0 GB
# Post used: 43.87 GB
# Memory used by list: 40.6 GB
# Memory used after clearing the list: 3.28 GB
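This is the same pointer-only accounting discussed above: sys.getsizeof(z) measures just the list header and its pointer array (about 8 bytes per slot), while each of the billion elements is a separate int object that getsizeof(z) ignores. A small-scale sketch (per-int sizes assume 64-bit CPython, so only the inequality should be relied on):

```python
import sys

n = 1_000_000  # small-scale stand-in for the 10**9-element list above
z = [*range(n)]

pointer_bytes = sys.getsizeof(z)                  # list header + pointer array only
element_bytes = sum(sys.getsizeof(i) for i in z)  # the int objects themselves

# the int objects dwarf the list's own footprint, which is why the process
# uses several times more memory than sys.getsizeof(z) reports
print(pointer_bytes, element_bytes)
```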