Time complexity of python set operations?
According to the Python wiki's Time complexity page, set is implemented as a hash table, so you can expect lookup/insert/delete to be O(1) on average. If your hash table's load factor is too high, you face collisions and degrade toward O(n).
P.S. For some reason the wiki claims O(n) for the delete operation, which looks like a typo.
P.P.S. This is true for CPython; PyPy is a different story.
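To make the "hash table" part concrete, here is a minimal sketch of how hashing gives average O(1) membership: the hash maps each item to one bucket, so a lookup scans only that one short bucket instead of the whole collection. This is an illustration, not CPython's actual implementation (CPython uses open addressing, not per-bucket lists), and all names are my own:

```python
class TinyHashSet:
    """Illustrative hash set: average O(1) add / contains / discard.

    Each bucket is a short list; a good hash function spreads items
    evenly, so buckets stay short regardless of the total size.
    """

    def __init__(self, n_buckets=8):
        self._buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, item):
        # hash() maps the item to an int; modulo picks a single bucket.
        return self._buckets[hash(item) % len(self._buckets)]

    def add(self, item):
        bucket = self._bucket(item)
        if item not in bucket:   # scans one short bucket, not all items
            bucket.append(item)

    def __contains__(self, item):
        return item in self._bucket(item)

    def discard(self, item):
        bucket = self._bucket(item)
        if item in bucket:
            bucket.remove(item)


s = TinyHashSet()
for x in range(100):
    s.add(x)
print(50 in s)   # True
s.discard(50)
print(50 in s)   # False
```

The O(n) degenerate case mentioned above corresponds to all items hashing into the same bucket, at which point every lookup scans one long list.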
The other answers do not cover two crucial set operations: union and intersection. On average, union takes O(m+n), whereas intersection takes O(min(m, n)), provided that there are not many elements in the sets with the same hash. A list of time complexities of common operations can be found here: https://wiki.python.org/moin/TimeComplexity
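A quick sketch of the two operations (variable names are my own): union must visit every element of both sets, while intersection can iterate the smaller set and probe the larger one, which is where the min comes from.

```python
a = {1, 2, 3, 4}
b = {3, 4, 5}

# Union visits every element of both sets: O(len(a) + len(b)) on average.
print(a | b)   # {1, 2, 3, 4, 5}

# Intersection iterates the smaller set and probes the larger:
# O(min(len(a), len(b))) on average.
print(a & b)   # {3, 4}
```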
I understand that sets are data structures where all the elements are sorted and no duplicate values are allowed, but why is their time complexity just O(1)?
How can it be constant, even if the set contains millions of elements?
I thought the complexity of a lookup would be O(log(n)) due to a binary search, but it looks like it's even faster, and I can't really understand how.
Thanks in advance for any answer!
from typing import List

class Solution:
    def containsDuplicate(self, nums: List[int]) -> bool:
        hashset = set()
        for n in nums:
            if n in hashset:
                return True
            hashset.add(n)
        return False

This is the solution provided, and they say it's O(n). The for loop is O(n) and the "if n in hashset" is O(n), and they're nested, so how isn't this O(n²)?