It's O(1) for both list and tuple. They are both morally equivalent to an integer indexed array.

Answer from David Heffernan on Stack Overflow
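As a quick illustration of constant-time indexing (a minimal sketch, not part of the answer):

```python
# Indexing a list and a tuple: both are O(1), regardless of position.
data_list = ["hello", "how", "are", "you"]
data_tuple = ("hello", "how", "are", "you")

print(data_list[2])    # are
print(data_tuple[-1])  # you (negative indices are O(1) too)
```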
DEV Community
dev.to › iihsan › time-complexity-analysis-of-python-methods-bigo-notations-for-list-tuple-set-and-dictionary-methods-47l9
Time Complexity Analysis of Python Methods: Big(O) Notations for List, Tuple, Set, and Dictionary Methods - DEV Community
January 15, 2024 - Whether you're working on real-world ... writing efficient and scalable code. So, understanding the time complexity of your code becomes essential. In this article, we'll break down the Python methods for lists, tuples, sets, and dictionaries....
Discussions

Hashing Time Complexity in Python
Yes, hashing takes O(k) where k is the length of the string, and I believe the same holds for a tuple. That makes your code O(nk).

However, Python strings are immutable, so the hash code gets cached on the string object (at least in CPython; this might be interpreter-dependent). That means that if you look up the string again, the hash doesn't need to be recalculated.

To check for collisions, the lookup key has to be compared with the key in the hash table. This is also O(k) for a string. But again there's an optimization: Python does string interning, where it caches short strings in a table. If you ask for a string "abc" and one already exists in the table, it'll just return the same string. So Python can compare references in O(1) time instead of comparing string values in O(k) time.

Edit: It looks like CPython doesn't cache tuple hash codes, since it was found to not be worth the cost (see the source code, particularly line 316). According to one comment by Tim Peters in the discussion, dicts also remember the hash codes of keys, regardless of whether the key object has cached the hash code.

tl;dr: O(k) hashing, but it's complicated.
🌐 r/leetcode · April 2, 2023
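The interning point above can be made observable with `sys.intern`, which exposes the same mechanism explicitly (a minimal sketch):

```python
import sys

# Build two equal strings at runtime so CPython does not fold them
# into a single compile-time constant.
a = "".join(["he", "llo"])
b = "".join(["hel", "lo"])

print(a == b)      # True: equal values, compared in O(k)

# Interning maps equal strings to one shared object, so comparison
# machinery can short-circuit with an O(1) identity check.
a_i = sys.intern(a)
b_i = sys.intern(b)
print(a_i is b_i)  # True
```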
Time complexity of casting lists to tuples in python and vice versa - Stack Overflow
Well, the original list could still be modified, so I don't think it would work to simply switch from writable to read-only. But you can look at the source and find out. Or you can time it and see if it follows O(N). ... It is an O(N) operation; tuple(list) simply copies the objects from the ...
🌐 stackoverflow.com
Python Like You Mean It
pythonlikeyoumeanit.com › Module2_EssentialsOfPython › DataStructures.html
Data Structures (Part I): Introduction — Python Like You Mean It
We will be using the “big-O” notation, \(\mathcal{O}(f(n))\), to summarize the performance of the algorithms used by Python’s data structures. The “Sequence Types” section already introduced lists and tuples. Recall that both provide the same interface for accessing and summarizing the contents of a heterogeneous sequence of objects. However, a list can be mutated - updated, removed from, and added to - whereas a tuple cannot be mutated. Thus a list is mutable, whereas a tuple is immutable. Here you will find a summary of the algorithmic complexities of many of the built-in functions that work on sequential data structures.
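A quick sketch of the mutable/immutable distinction described above:

```python
lst = [1, 2, 3]
tup = (1, 2, 3)

lst[0] = 99          # fine: lists are mutable
print(lst)           # [99, 2, 3]

try:
    tup[0] = 99      # tuples are immutable: item assignment fails
except TypeError as e:
    print("TypeError:", e)
```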
Quora
quora.com › When-we-use-in-and-not-in-for-ensuring-existence-of-an-element-in-list-string-tuple-set-My-question-is-whats-the-actual-time-complexity-for-these-two-operations
When we use 'in' and 'not in' for ensuring existence of an element in list/string/tuple/set. My question is : what's the actual time complexity for these two operations? - Quora
Since a string is really just an array, this goes for strings too. Linear time. The “in” operation is normally only defined on tuples in a dynamic language. In that case, a tuple is again very much like an array, and so you get linear complexity.
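The linear-time `in` described above is worth contrasting with a hash-based container (a sketch; the O(1) figure for sets is the average case):

```python
# Membership testing: O(n) for list/tuple/str, O(1) average for set/dict.
words_list = ["hello", "how", "are", "you"]
words_tuple = tuple(words_list)
words_set = set(words_list)

# All three support `in`, but the first two scan element by element,
# while the set does a hash lookup.
print("are" in words_list)       # True, after scanning up to n elements
print("are" in words_tuple)      # True, same linear scan
print("are" in words_set)        # True, O(1) on average
print("maybe" not in words_set)  # True
```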
Reddit
reddit.com › r/leetcode › hashing time complexity in python
r/leetcode on Reddit: Hashing Time Complexity in Python
April 2, 2023

When we use strings and tuples as python dictionary keys, is the time complexity of accessing a particular dictionary item (using dictionary[key]) O(n), where n is the length of the string or tuple?

For example, in the code below if n is the length of the input list and k is the maximum length of a string in the input list, is the time complexity O(nk) or O(n)?

def groupStrings(self, strs: List[str]) -> List[List[str]]:  # needs `from typing import List`
    groups = []
    groupDict = {}  # maps a string to the index of its group in `groups`

    for string in strs:
        if string in groupDict:
            groups[groupDict[string]].append(string)
        else:
            groupDict[string] = len(groups)
            groups.append([string])

    return groups
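For reference, a standalone, runnable version of the grouping logic above (assuming the intent is to group equal strings while preserving first-seen order); per the discussion, each dict probe costs O(k) hashing plus an O(k) key comparison, so the loop is O(nk) overall:

```python
from typing import List

def group_strings(strs: List[str]) -> List[List[str]]:
    """Group equal strings, preserving first-seen order.

    Each `in` check and subscript on group_index hashes the key:
    O(k) for a string of length k, so the whole loop is O(nk).
    """
    groups: List[List[str]] = []
    group_index = {}  # string -> position of its group in `groups`
    for s in strs:
        if s in group_index:
            groups[group_index[s]].append(s)
        else:
            group_index[s] = len(groups)
            groups.append([s])
    return groups

print(group_strings(["ab", "cd", "ab", "ef", "cd"]))
# [['ab', 'ab'], ['cd', 'cd'], ['ef']]
```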
Plain English
python.plainenglish.io › python-lists-and-tuples-760d45ebeaa8
Python Lists vs Tuples: What’s the Difference? | by Mayur Jain | MLWorks | Medium
June 26, 2025 - List and tuples comes under a class ... as the element itself. Because given a position or index, it takes O(1) time complexity to find an element....
Real Python
realpython.com › python-tuple
Python's tuple Data Type: A Deep Dive With Examples – Real Python
November 6, 2023 - Internally, tuples keep track of their length, so calling len() with a tuple as an argument is a fast operation with a time complexity of O(1). You may need to compare tuples at some point in your coding journey. Fortunately, tuples support the standard comparison operators. When you compare two tuples, Python uses lexicographical ordering.
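A sketch of both points above, O(1) `len()` and lexicographical comparison:

```python
tup = (1, 2, 3, 4)

print(len(tup))  # 4 -- tuples store their length, so this is O(1)

# Lexicographical ordering: compare element by element, left to right.
print((1, 2, 3) < (1, 3))        # True: 2 < 3 decides it
print((1, 2) < (1, 2, 0))        # True: a strict prefix sorts first
print(("a", "b") == ("a", "b"))  # True
```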
Index.dev
index.dev › blog › python-tuple-methods-operations-guide
Python Tuple Methods and Operations: A Practical Guide ...
August 22, 2024 - Repetition also constitutes a new tuple. Time complexity of repetition is equal to the number of times it repeats, multiplied by the length of the tuple.
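That is, repeating a length-m tuple n times builds a new tuple of length n·m, so the cost is O(n·m). A minimal sketch:

```python
base = (1, 2)        # length m = 2
repeated = base * 3  # n = 3 repetitions -> new tuple of length 6

print(repeated)          # (1, 2, 1, 2, 1, 2)
print(repeated is base)  # False: repetition builds a new tuple
print(base * 0)          # (): zero repetitions give the empty tuple
```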
Python
wiki.python.org › moin › TimeComplexity
TimeComplexity - Python Wiki
As seen in the source code the complexities for set difference s-t or s.difference(t) (set_difference()) and in-place set difference s.difference_update(t) (set_difference_update_internal()) are different! The first one is O(len(s)) (for every element in s add it to the new set, if not in t).
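A sketch of the two operations: `s - t` builds a new set in O(len(s)), while `difference_update` mutates `s` in place in O(len(t)):

```python
s = {1, 2, 3, 4}
t = {3, 4, 5}

# s - t: iterate over s, keep elements not in t -> O(len(s)), new set.
diff = s - t
print(diff)  # {1, 2}
print(s)     # {1, 2, 3, 4} -- s is unchanged

# In-place variant: iterate over t, discard from s -> O(len(t)).
s.difference_update(t)
print(s)     # {1, 2}
```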
Stack Exchange
cs.stackexchange.com › questions › 89849 › doubt-on-analysis-on-time-and-space-complexity-of-creating-n²-tuples
Doubt on analysis on time and space complexity of creating n² tuples - Computer Science Stack Exchange
March 28, 2018 - I saw a related post on strings instead of tuples, in this case time= 1+2+…n=n^2, which supports my inference. https://stackoverflow.com/questions/37133547/time-complexity-of-string-concatenation-in-python
Top answer
1 of 2

It is an O(N) operation; tuple(list) simply copies the objects from the list to the tuple. So you can still modify the internal objects (if they are mutable), but you can't add new items to the tuple.

Copying a list takes O(N) time.

>>> tup = ([1, 2, 3],4,5 ,6)
>>> [id(x) for x in tup]
[167320364, 161878716, 161878704, 161878692]
>>> lis = list(tup)

The internal objects still refer to the same objects:

>>> [id(x) for x in lis]
[167320364, 161878716, 161878704, 161878692]

But the outer containers are now different objects, so modifying one won't affect the other.

>>> tup is lis
False
>>> lis.append(10)
>>> lis, tup
([[1, 2, 3], 4, 5, 6, 10], ([1, 2, 3], 4, 5, 6)) #10 not added in tup

Modifying a mutable internal object will affect both containers:

>>> tup[0].append(100)
>>> tup[0], lis[0]
([1, 2, 3, 100], [1, 2, 3, 100])

A timing comparison suggests that list copying and tuple creation take almost equal time; since creating a new object has its own overhead, tuple creation is slightly more expensive.

>>> lis = list(range(100))
>>> %timeit lis[:]
1000000 loops, best of 3: 1.22 us per loop
>>> %timeit tuple(lis)
1000000 loops, best of 3: 1.7 us per loop
>>> lis = list(range(10**5))
>>> %timeit lis[:]
100 loops, best of 3: 2.66 ms per loop
>>> %timeit tuple(lis)
100 loops, best of 3: 2.77 ms per loop
2 of 2

As far as I understand, there's no bit to switch: the list (mutable) object is a completely different object from the tuple (immutable) object. They have different methods, etc.

One experiment you might do is this:

>>> a = (1,2,3,4,5)
>>> a
(1, 2, 3, 4, 5)
>>> b = list(a)
>>> b
[1, 2, 3, 4, 5]
>>> b[2] = 'a'
>>> b
[1, 2, 'a', 4, 5]
>>> a
(1, 2, 3, 4, 5)

See: if they were referencing the exact same place in memory, then a should have changed as well.

That's my understanding.

Stack Overflow
stackoverflow.com › questions › 32779841 › time-complexity-of-list-and-tuple-lookup
python - Time complexity of list and tuple lookup - Stack Overflow
tuple and list both have constant time O(1) for searching an element, but is it a different constant time for each? Let's say I have two variables list1 = ["hello", "how", "are", "you"] tuple1 = ("
GeeksforGeeks
geeksforgeeks.org › python › access-front-and-rear-element-of-python-tuple
Access front and rear element of Python tuple - GeeksforGeeks
April 12, 2023 - Time Complexity: O(1) Auxiliary Space: O(n), where n is the length of the tuple. This is because the *_ syntax creates a new list with length n-2, which takes up memory space. ... This program demonstrates how to access the first and last elements ...
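The `*_` pattern the snippet refers to looks like this (a sketch; the starred target copies the middle elements into a throwaway list, which is where the O(n) auxiliary space comes from):

```python
tup = (10, 20, 30, 40, 50)

# Extended unpacking: first and last are O(1) lookups, but the starred
# target materializes the middle as a new list (auxiliary O(n) space).
front, *middle, rear = tup
print(front, rear)  # 10 50
print(middle)       # [20, 30, 40]

# Plain indexing avoids the extra list entirely: O(1) time and space.
print(tup[0], tup[-1])  # 10 50
```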
GitHub
gist.github.com › Gr1N › 60b346b5e91babb5efac
Complexity of Python Operations · GitHub
Complexity of Python Operations. GitHub Gist: instantly share code, notes, and snippets.
Top answer
1 of 1

TLDR: Your time complexities are correct, though your O(n^3) space for the recursive gen_seq is too pessimistic (it is still significantly more wasteful). Note that the optimal static solution is O(n^2), since that is the size of the answer. If no static answer is required, space complexity can be lowered to O(1).


Let's start by establishing some basic complexities. The below applies to both time and space complexity:

  • Creating a character literal is O(1).
  • Creating a tuple of size n is O(n).
    • Creating an empty or single-element tuple is O(1).
  • Concatenating two tuples of length n and m is O(n+m).
    • Concatenating two tuples of length n^2 and m, it is O(n^2+m) = O(n^2).

Iteration:

def gen_seq(n): # iterative
    seq = ('F',)
    for i in range(2, n+1):
        side = ('T',) + (i-1)*('F',)  # note: `i-1` instead of `n-1`
        seq += side + side + ('F',)
    return seq

Key points for complexity are:

  • The range(const, n+1) loop is O(n) time complexity, and O(1) space.
  • The side is constructed anew at a size of n for i->n. Space is reused, for a maximum of O(n) space. Time is consumed on all n iterations, for O(n*n) = O(n^2) time.
  • The seq is concatenated with an n-tuple on all n iterations. Space is reused, for a maximum of O(n*n) = O(n^2) space. Time is consumed on all n iterations, for O(n*n^2) = O(n^3) time.

The largest complexity wins, so iteration uses O(n^2) space and O(n^3) time.


Recursion:

def gen_seq(n): # recursive
    if n == 1:
        return ('F',)
    else:
        side = ('T',) + (n-1)*('F',)
        return gen_seq(n-1) + side + side + ('F',)

Key points for complexity are:

  • Recursion is repeated from n->1, meaning O(n) time.
  • The side is constructed anew at a size of n. Space is not reused, since each side is constructed before recursion, for a maximum of O(n*n) = O(n^2) space. Time is consumed on all n iterations, for O(n*n) = O(n^2) time.
  • The return value is concatenated with an n-tuple on all n iterations. Space is reused, since each return value is constructed after recursion, for a maximum of O(n*n) = O(n^2) space. Time is consumed on all n iterations, for O(n*n^2) = O(n^3) time.

The largest complexity wins, so recursion uses O(n^2) space and O(n^3) time.


The limit for your time complexity is that the result of each step must be repeated in the next. In Python, this can be circumvented using a generator - this allows you to return intermediate results and proceed with generating more results:

def gen_seq(n):
    yield 'F'
    for i in range(1, n):
        yield 'T'
        yield from ('F' for _ in range(i))
        yield 'T'
        yield from ('F' for _ in range(i))

seq = tuple(gen_seq(n))

Key points for complexity are:

  • The range(n) loop is O(n) time complexity, and O(1) space.
  • The yield from ... range(i) loop is O(n) time, and O(1) space. Space reuse leaves this at O(1) space. Repetition by n times gives O(n*n) = O(n^2) time.
  • Concatenating all results at once via tuple is O(n^2 * 1) = O(n^2) space.

The largest complexity wins, so the generator approach uses O(n^2) space and O(n^2) time. If the result is not stored but directly used, only O(1) space is used.

🌐
DEV Community
dev.to › global_codess › time-complexities-of-python-data-structures-3bja
Time Complexities Of Python Data Structures - DEV Community
February 19, 2020 - For example, a linear search’s ... Tuples support all operations that do not mutate the data structure (and they have the same complexity classes)....
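For instance, the non-mutating sequence operations are shared between the two types, with the same complexity classes (a minimal sketch):

```python
lst = [1, 2, 2, 3]
tup = (1, 2, 2, 3)

# Same non-mutating API, same complexity classes:
print(lst.count(2), tup.count(2))  # 2 2  -- O(n) scan for both
print(lst.index(3), tup.index(3))  # 3 3  -- O(n) scan for both
print(len(lst), len(tup))          # 4 4  -- O(1) for both
```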