According to the Python wiki's Time Complexity page, set is implemented as a hash table, so you can expect O(1) average-case lookup, insert, and delete. If the table's load factor gets too high, collisions pile up and operations can degrade to O(n).

P.S. For some reason they list O(n) for the delete operation, which looks like a typo.

P.P.S. This is true for CPython; PyPy is a different story.

Answer from Sergey Romanovsky on Stack Overflow
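A quick experiment makes the average-case O(1) claim concrete: a membership test against a set takes roughly constant time, while the same test against a list scans linearly (a minimal sketch; exact timings vary by machine):

```python
import timeit

# Build a list and a set holding the same 100,000 integers.
n = 100_000
data_list = list(range(n))
data_set = set(data_list)

target = n - 1  # worst case for the list: it must scan to the end
list_time = timeit.timeit(lambda: target in data_list, number=100)
set_time = timeit.timeit(lambda: target in data_set, number=100)

# The set hashes straight to a bucket; the list walks every element.
print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
```

On a typical machine the set lookup is several orders of magnitude faster, and the gap widens as n grows.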
🌐
Python
wiki.python.org › moin › TimeComplexity
TimeComplexity
As seen in the source code, the complexities for set difference s-t or s.difference(t) (set_difference()) and in-place set difference s.difference_update(t) (set_difference_update_internal()) are different! The first one is O(len(s)): for every element in s, add it to the new set if it is not in t.
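The distinction can be sketched in a few lines: `s - t` walks all of `s` to build a new set, while the in-place `difference_update` only needs one lookup per element of `t`:

```python
s = {1, 2, 3, 4}
t = {3, 4, 5}

# s - t builds a new set: iterate over s, keep what is not in t -> O(len(s)).
print(s - t)             # {1, 2}

# difference_update discards t's elements from s in place:
# one average-O(1) lookup/removal per element of t -> O(len(t)).
s.difference_update(t)
print(s)                 # {1, 2}
```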
Discussions

Time complexity of sets
Ok, so sets/dictionaries work by hashing the index value. So that it's a constant time to find the item.... More on reddit.com
🌐 r/learnpython
March 17, 2021
Time complexity of python set operations?
Python sets are powerful data ... tables, similar to dictionaries in Python but only storing keys without associated values. Due to their hash table implementation, most of the common operations on sets have efficient time complexities.... More on designgurus.io
🌐 designgurus.io
June 21, 2024
What's the time complexity of set.add()?
when I was reading the python docs at this part click here I had a slight doubt: whenever we perform set.add(item), does this traverse the entire set checking item equality against the elements already inside the set? Code: class Foo: def __eq__(self, other): print("Called me.") return ... More on discuss.python.org
🌐 discuss.python.org
March 27, 2024
python - What is time complexity of a list to set conversion? - Stack Overflow
I've noticed the table of the time complexity of set operations on the python official website. But i just wanna ask what's the time complexity of converting a list to a set, for instance, l = [1,... More on stackoverflow.com
🌐 stackoverflow.com
🌐
GeeksforGeeks
geeksforgeeks.org › python › internal-working-of-set-in-python
Internal working of Set in Python - GeeksforGeeks
July 11, 2025 - The difference between two sets can be found using the difference() method or the - operator. The time complexity for this operation is O(len(s1)), where s1 is the set from which elements are being subtracted.
🌐
AskPython
askpython.com › home › runtime complexities of data structures in python
Runtime Complexities of Data Structures in Python - AskPython
December 23, 2021 - In this article, we will be looking at the different types of runtime complexities associated with programming algorithms. We will be looking at time and space complexities, different case scenarios, and specific time complexities. We will also be looking up at time complexities of different python ...
🌐
Reddit
reddit.com › r/learnpython › time complexity of sets
r/learnpython on Reddit: Time complexity of sets
March 17, 2021 -

I understand that sets are data structures where all its elements are sorted and it doesn't contain any duplicate values, but why is their time complexity just O(1)?

How can it be a constant value, even if the set contains millions of elements?

I thought that the complexity was O(n*log(n)) due to a binary search, but looks like it's even faster and I can't really understand how.

Thanks in advance for any answer!

Top answer
1 of 3
Ok, so sets/dictionaries work by hashing the key value, so it takes constant time to find the item. You don't iterate through the set/dictionary; you just ask: what is the value at this address?

Say there are a number of people living on a street, everyone lives at the address that matches the length of their last name, and I told you to go to "smith". You wouldn't spend time checking houses to find smith; you would immediately go to house 5. The constant time spent was converting "smith" to 5. It would take you the same constant time to find where Scot or Johnson lived. That's how a hash works: it converts whatever value you have into an address in memory. It gets a bit more complex than just "length", and there is code in place to handle collisions (smith and jones are not at the same address), but that's the simple version of it.

As for "I understand that sets are data structures where all its elements are sorted": they're not sorted, they're unordered. In recent versions of Python, dictionaries maintain "insertion order".
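The street analogy can be turned into a toy sketch (hypothetical names and all): the "hash" is just the length of the last name, and a small per-house list stands in for collision handling:

```python
# Toy "street" hash: each name lives at the house numbered by the
# length of the name (the analogy's stand-in for a real hash function).
street = {}

def move_in(name):
    # Names of the same length collide; keep them together in a bucket.
    street.setdefault(len(name), []).append(name)

def find(name):
    # Jump straight to the one house; only collisions need scanning.
    return name in street.get(len(name), [])

for resident in ["smith", "scot", "johnson", "jones"]:
    move_in(resident)

print(find("smith"))   # True ("smith" shares house 5 with "jones")
print(find("doe"))     # False (nobody at house 3)
```

Finding "smith" never requires visiting houses 1 through 4; the only extra work is scanning the one bucket it shares with "jones".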
2 of 3
As others have pointed out, these are implemented with hash tables. Hashing is when you generate some pseudorandom number from some input data. In a hash table, that number is clipped (modulo) so as to fit inside the table.

Ideally, different data will always get you a different number, so you end up in the right spot of the hash table in constant time, but that's obviously not always going to happen, and you will get so-called hash collisions. When those happen, some sort of strategy is needed to deal with them, and since you'd ideally design your hash table so they don't happen very often, that strategy tends to just be to use the next spot in the table and then search linearly. In that sense, it's not exactly a constant-time algorithm, but you really should only be searching a very small portion of the full table, so it's close.

As the table fills up (its "load factor" increases), this cost generally grows, although that is not universally true (e.g., when perfect hashing is an option). It can also happen that the hash table needs to be grown, which will generally not be a constant-time operation; there are all sorts of strategies for that. More often than not, though, the hashing step will not lead to a collision, and you get O(1) performance.
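A minimal open-addressing table with linear probing illustrates the collision strategy described above (a sketch only; CPython's real implementation uses a more elaborate probe sequence and resizes as the load factor grows):

```python
class Table:
    """Toy open-addressing hash table with linear probing.

    Assumes the table never fills up (no resizing), which is
    enough to show the probing idea.
    """

    def __init__(self, size=8):
        self.slots = [None] * size

    def _probe(self, key):
        i = hash(key) % len(self.slots)
        # On a collision, step linearly to the next slot until we
        # find the key itself or an empty slot.
        while self.slots[i] is not None and self.slots[i] != key:
            i = (i + 1) % len(self.slots)
        return i

    def add(self, key):
        self.slots[self._probe(key)] = key

    def __contains__(self, key):
        return self.slots[self._probe(key)] == key

t = Table()
for x in (3, 11, 19):   # 3, 11, 19 all map to slot 3 in a size-8 table
    t.add(x)
print(11 in t, 4 in t)  # True False
```

Looking up 11 costs one extra probe past the colliding 3, which is the "searching a very small portion of the table" case; lookups with no collision stop at the first slot.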
🌐
DEV Community
dev.to › williams-37 › understanding-time-complexity-in-python-functions-5ehi
Understanding Time Complexity in Python Functions - DEV Community
October 25, 2024 - Finding the length of a list, dictionary, or set is a constant time operation. List Comprehensions: [expression for item in iterable] → O(n) The time complexity of list comprehensions is linear, as they iterate through the entire iterable.
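Both of those rules are easy to check directly (a small sketch): `len()` reads a stored size rather than counting, and a comprehension visits every element once:

```python
data = list(range(1_000_000))

# len() is O(1): Python stores the length instead of counting elements.
print(len(data))   # 1000000

# A list comprehension is O(n) in the iterable it walks: it touches
# each element exactly once.
squares = [x * x for x in data[:5]]
print(squares)     # [0, 1, 4, 9, 16]
```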
Find elsewhere
🌐
Python Morsels
pythonmorsels.com › time-complexities
Python Big O: the time complexities of different data structures in Python - Python Morsels
April 16, 2024 - For example, sets are faster at key lookups than lists, but they have no ordering. Dictionaries are just as fast at key lookups as sets and they maintain item insertion order, but they require more memory. In day-to-day Python usage, time complexity tends to matter most for avoiding loops within ...
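The trade-off mentioned here fits in a few lines: both structures give O(1) key lookups, but only the dict promises to keep insertion order:

```python
keys = ["banana", "apple", "cherry"]

d = dict.fromkeys(keys)   # dict: O(1) lookups, insertion order preserved
s = set(keys)             # set: O(1) lookups, no ordering guarantee

print(list(d))            # ['banana', 'apple', 'cherry']
print("apple" in d, "apple" in s)   # True True
```

Iterating over the set may yield the keys in any order, which is why code that needs both fast lookups and a stable order reaches for a dict.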
🌐
W3Schools
w3schools.com › python › python_dsa_lists.asp
Python Lists and Arrays
Each algorithm in this tutorial ... with its time complexity. ...
🌐
Data Basecamp
databasecamp.de › en › python-coding › time-complexity
What is Time Complexity? | Data Basecamp
January 3, 2024 - These tools help identify potential ... In Python, you can use built-in modules like timeit or third-party libraries like cProfile and line_profiler to profile your code and measure its time complexity....
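For example, `timeit` from the standard library times a snippet over many runs to smooth out noise (cProfile would instead break the time down per function call):

```python
import timeit

# Run 10,000 set-membership tests; repeating averages out scheduler
# and interpreter noise that a single run would show.
elapsed = timeit.timeit(
    stmt="x in s",
    setup="s = set(range(1000)); x = 999",
    number=10_000,
)
print(f"{elapsed:.6f} s for 10,000 membership tests")
```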
🌐
Python.org
discuss.python.org › python help
What's the time complexity of set.add()? - Python Help - Discussions on Python.org
March 27, 2024 - when I was reading the python docs at this part click here I had a slight doubt that whenever we perform any set.add(item) does this happens to traverse the entire set for item equality with the elements already inside the set. Code: class Foo: def __eq__(self, other): print("Called me.") return id(self) == id(other) def __hash__(self): return 1 def __repr__(self): return "Dummy()" s = {Foo(), Foo(), Foo(), Foo()} print("==========") s.add(Foo...
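The truncated experiment above can be reconstructed roughly like this (a sketch of the idea, not the poster's exact code): every Foo hashes to the same bucket, so each add compares only against the colliding elements already present, not the whole set:

```python
class Foo:
    def __eq__(self, other):
        print("Called me.")
        return id(self) == id(other)

    def __hash__(self):
        # Force every instance into the same bucket, so each add must
        # run equality checks against prior colliding elements.
        return 1

s = set()
for _ in range(3):
    s.add(Foo())  # prints "Called me." once per element already present

print(len(s))     # 3: all instances are distinct by identity
```

With well-distributed hashes, `set.add` almost never calls `__eq__` at all; this pathological `__hash__` is what makes the equality calls visible.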
🌐
Plain English
python.plainenglish.io › understanding-algorithm-time-complexity-with-python-ecbe57e7cb5f
Understanding Algorithm Time Complexity With Python | by Marcus Sena | Python in Plain English
August 19, 2024 - For example, if an algorithm has a complexity f(n) = 3n² + 2n + 5, the term n² dominates as n becomes very large, making the Big O notation O(n²). Using the assumptions and properties presented earlier, we can create a simple Python function that calculates the elapsed time of execution of a function for different input sizes and plots the calculated execution times against the input sizes.
🌐
GeeksforGeeks
geeksforgeeks.org › python › time-complexity-of-a-list-to-set-conversion-in-python
Time Complexity of A List to Set Conversion in Python - GeeksforGeeks
July 23, 2025 - The time to convert small list to set: 0.0. The set is: {1, 2, 3, 4, 5}. The time to convert large list to set: 0.21737. The time complexity of list-to-set conversion is O(n), where n is the number of elements in the list.
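That linear growth is easy to reproduce (timings are machine-dependent; the ratio is the point):

```python
import timeit

# set(lst) must hash and insert each element once, so the cost
# scales linearly with the length of the list.
small = list(range(1_000))
large = list(range(100_000))   # 100x the elements

t_small = timeit.timeit(lambda: set(small), number=100)
t_large = timeit.timeit(lambda: set(large), number=100)
print(f"large/small time ratio: {t_large / t_small:.1f}")
```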
🌐
Medium
binarybeats.medium.com › python-set-data-structure-methods-use-time-and-space-complexity-366b8c408345
Python Set Data Structure: Methods, Use, Time, and Space Complexity | by Binary Beats | Medium
April 22, 2023 - By using sets, we can solve this problem efficiently in O(n) time complexity, where n is the total number of elements in both lists.
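The article's exact problem isn't shown in the snippet, but a typical O(n) use of sets over two lists is finding their common elements (the function name here is illustrative):

```python
def common_elements(a, b):
    # Building the set costs O(len(a)); each membership probe is O(1)
    # on average, so the whole thing is O(len(a) + len(b)) instead of
    # the O(len(a) * len(b)) of nested loops.
    seen = set(a)
    return [x for x in b if x in seen]

print(common_elements([1, 2, 3, 4], [3, 4, 5]))  # [3, 4]
```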
🌐
Wikipedia
en.wikipedia.org › wiki › Hash_table
Hash table - Wikipedia
1 week ago - The .NET standard library includes HashSet and Dictionary, so it can be used from languages such as C# and VB.NET. ... ^ There are approaches with a worst-case expected time complexity of O(log²((1 − α)⁻¹)), where α is the load factor.
🌐
DEV Community
dev.to › iihsan › time-complexity-analysis-of-python-methods-bigo-notations-for-list-tuple-set-and-dictionary-methods-47l9
Time Complexity Analysis of Python Methods: Big(O) Notations for List, Tuple, Set, and Dictionary Methods - DEV Community
January 15, 2024 - The time complexity of a function is measured in Big(O) notations that give us information about how fast a function grows subject to input sizes. The following are different notations with examples while calculating the time complexity of any piece of code:
🌐
Code Like A Girl
code.likeagirl.io › time-complexities-of-python-dictionary-and-set-operations-ee13511a2881
Time Complexities of Python Dictionary and Set Operations | by Python Code Nemesis | Code Like A Girl
November 7, 2023 - The time complexity of inserting an element into a Python set is typically O(1), meaning the time taken does not depend on the size of the set. This O(1) insertion is achievable because sets in Python are implemented using hash tables, ...