The Python wiki on time complexity lists a single intersection as O(min(len(s), len(t))), where s and t are sets with sizes len(s) and len(t), respectively. (In English: the time is bounded by, and linear in, the size of the smaller set.)
Note: based on the comments below, this wiki entry would be wrong if the argument passed is not a set. I've since corrected the wiki entry.
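As a rough illustration (the function name here is my own, not CPython's), a hash-based intersection achieves this bound by iterating over the smaller operand and probing the larger one:

```python
def intersect(s, t):
    """Sketch of how a hash-based set intersection achieves
    O(min(len(s), len(t))) expected time: iterate the smaller
    set and probe the larger one (O(1) expected per lookup)."""
    if len(t) < len(s):
        s, t = t, s  # ensure s is the smaller set
    return {x for x in s if x in t}

a = {1, 2, 3, 4, 5}
b = {4, 5, 6}
print(intersect(a, b) == (a & b))  # True
```

This also shows why the bound breaks for a non-set argument: if t were an unsorted list, the membership test `x in t` would be O(len(t)) instead of O(1), so the iterable must be converted to a set first.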
If you have n sets (sets, not iterables), you'll do n-1 intersections, and the total time is bounded by O((n-1) · len(s)), where s is the smallest of the sets.
Note that as you perform successive intersections the result can only shrink, so the big-O bound is a worst case; in practice the time will usually be better than this.
However, looking at the specific code in the question, this idea of taking the min() applies only to a single pair of sets and doesn't extend to multiple sets. So in that case we have to be pessimistic and take s as the set with the largest size.
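A sketch of the n-set case (my own helper, not from the question's code): ordering the operands by size and intersecting outward from the smallest keeps each step bounded by the running result, which can only shrink:

```python
from functools import reduce

def intersect_all(*sets):
    """Intersect n sets. Starting from the smallest set bounds every
    subsequent intersection by the size of the running result, which
    can only shrink -- each step is O(len(current_result)) expected."""
    ordered = sorted(sets, key=len)  # smallest first
    return reduce(lambda acc, s: acc & s, ordered[1:], ordered[0].copy())

print(intersect_all({1, 2, 3, 4}, {2, 3, 4, 5, 6}, {3, 4, 9}))  # {3, 4}
```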
Answer from rocky on Stack Overflow

algorithms - Time complexity of set intersection - Computer Science Stack Exchange
For example, when getting the common elements of two sets in Python: what's the complexity in that case, and how does it work?
Your post contains two problems. I will only address the first.
There is a simple algorithm for computing A ∩ B, which involves an array of length max(A ∪ B).
Another algorithm sorts both sets to find the intersection in time O(n log n), where n = |A| + |B|. In contrast to the previous algorithm, this algorithm can be implemented using only comparisons. There is a matching Ω(n log n) lower bound in the algebraic decision tree model; see for example lecture notes of Otfried Cheong.
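The two algorithms described above can be sketched as follows (my own function names; the array-based version assumes sets of non-negative integers):

```python
def intersect_by_array(a, b):
    """Array-based intersection: O(max(a | b)) time and space.
    Mark members of a in a boolean array indexed by value, then
    collect the members of b that are marked."""
    seen = [False] * (max(a | b) + 1)
    for x in a:
        seen[x] = True
    return {x for x in b if seen[x]}

def intersect_by_sorting(a, b):
    """Sort-based intersection: O(n log n) for n = len(a) + len(b),
    using only comparisons. Merge the two sorted sequences, keeping
    elements that appear in both."""
    xs, ys = sorted(a), sorted(b)
    i = j = 0
    out = set()
    while i < len(xs) and j < len(ys):
        if xs[i] == ys[j]:
            out.add(xs[i])
            i += 1
            j += 1
        elif xs[i] < ys[j]:
            i += 1
        else:
            j += 1
    return out

a, b = {1, 4, 7, 9}, {4, 9, 12}
print(intersect_by_array(a, b) == intersect_by_sorting(a, b) == (a & b))  # True
```

Note that the array-based version's cost depends on the magnitude of the elements, not their count: for the comment's example with 1,000,000,000 as an element, it would allocate a billion-entry array.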
You can’t give the time complexity because a set is not a primitive data structure, so you need to know how it is represented.
Why would n be part of the input size? Let A = { 1,000,000,000 } and B = { 1, 1,000,000,000 }, for example.