There is a very detailed table on the Python wiki which answers your question.

However, in your particular example you should use enumerate to get the index of an iterable within a loop, like so:

for i, item in enumerate(some_seq):
    bar(item, i)
Answer from SilentGhost on Stack Overflow
TimeComplexity - Python Wiki
Internally, a list is represented as an array; the largest costs come from growing beyond the current allocation size (because everything must move), or from inserting or deleting somewhere near the beginning (because everything after that must move).
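The asymmetry the wiki describes is easy to observe by timing the two extremes; a minimal sketch of my own (not from the wiki), comparing growth at the end against growth at the front:

```python
from timeit import timeit

N = 20_000

def build_with_append():
    lst = []
    for i in range(N):
        lst.append(i)      # writes at the end: amortized O(1)
    return lst

def build_with_insert_front():
    lst = []
    for i in range(N):
        lst.insert(0, i)   # shifts every existing element: O(n) per call
    return lst

t_append = timeit(build_with_append, number=1)
t_front = timeit(build_with_insert_front, number=1)
print(f"append: {t_append:.4f}s, insert(0): {t_front:.4f}s")
```

On any recent CPython the insert-at-front build is dramatically slower, exactly because everything after position 0 must move on each call.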
Discussions

What is the time complexity of the “in” operation
This moved much faster because it does fewer tests: if you're checking for membership in a list, you can stop as soon as you find the element. If you're comparing every element of the list to a value, as the first example does, then you check every element of the list. Doing less is always faster.
r/learnpython, September 1, 2021
Big O Cheat Sheet: the time complexities of operations on Python's data structures
Good for people getting into programming in general. I only have one remark: I wouldn't qualify O(n) as "Slow!", since it's still practically fast for low values of n and has the elegance of scaling linearly, which is one of the best scenarios available in the vast majority of cases a programmer will face.
r/Python, April 16, 2024
algorithm - Time Complexity of List creation in Python - Stack Overflow
Would the time complexity of these operations be O(N)? Because you would need to iterate through all of the items up to N and append them to the list. Append is O(1), therefore it is N*O(1) = O(N). Thoughts? Thanks. ... Thanks everyone. Those threads suggest it is O(N), which is what I speculated, but they don't explain why. Can anyone find concrete proof of why creating a list of size N will be O(N) in Python...
Why is the time complexity of Python's list.append() method O(1)? - Stack Overflow
As seen in the documentation for TimeComplexity, Python's list type is implemented using an array. So if an array is being used and we do a few appends, eventually you will have to reallocate space...
Complexity Cheat Sheet for Python Operations - GeeksforGeeks
July 12, 2025 - This cheat sheet is designed to help developers understand the average and worst-case complexities of common operations for these data structures that help them write optimized and efficient code in Python. Python's list is an ordered, mutable sequence, often implemented as a dynamic array. Below are the time complexities for common list operations:
r/learnpython on Reddit: What is the time complexity of the “in” operation
September 1, 2021

I’m not the biggest Python user, but I was looking at a friend's code yesterday and they had something like:

for x in (list of 40000):
    for y in (list of 2.7 million):
        if x == y:
            append something

This was obviously super slow so they changed it to something like:

for x in (list of 2.7 million):
    if x in (list of 40000):
        append something

This moved much faster. I get the point of one for loop being faster than two, but what is that “in” membership check doing that makes it so much faster? I always thought that checking whether something exists in a list is O(n), which shouldn’t be faster. Also, this was for ML purposes, so they were likely using numpy stuff.
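To see what the poster is describing, here is a scaled-down sketch of the three variants (list sizes and contents are invented stand-ins for the 2.7-million and 40,000-element lists), including the set-based version that makes the lookup truly O(1) on average:

```python
# Scaled-down stand-ins for the two lists in the post.
big = list(range(20_000))
small = list(range(0, 20_000, 100))   # 200 elements, all present in big

# Version 1: nested loops - every pair is compared, O(len(small) * len(big)).
matches_nested = []
for x in small:
    for y in big:
        if x == y:
            matches_nested.append(x)

# Version 2: `in` on a list - still O(n) per lookup, but it stops scanning
# at the first hit instead of always walking the whole list.
matches_in = [x for x in small if x in big]

# Version 3: `in` on a set - average O(1) per lookup, the real fix.
big_set = set(big)
matches_set = [x for x in small if x in big_set]

assert matches_nested == matches_in == matches_set
```

All three produce the same matches; only the amount of scanning differs.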

Understanding Python List Operations: A Big O Complexity Guide | by Ivan Markeev | Medium
June 4, 2023 - Appending one list to another in Python takes time proportional to the length of the list being appended (k). The elements of the second list need to be copied to the first list, resulting in O(k) complexity.
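A small illustration of that O(k) behaviour (the sizes here are arbitrary): extending a million-element list by three elements copies only the three new references, regardless of how large the target list already is.

```python
a = list(range(1_000_000))   # large target list
b = [-1, -2, -3]             # k = 3 elements to copy

a.extend(b)                  # O(k): only b's three references are copied

assert len(a) == 1_000_003
assert a[-3:] == [-1, -2, -3]
```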
What are the time complexity considerations of lists in Python? - Quora
Answer: In a normal list on average:
* Append: O(1)
* Extend: O(k), where k is the length of the extension
* Index: O(1)
* Slice: O(k)
* Sort: O(n log n), where n is the length of the list
* Len: O(1)
* Pop: O(1) (from the end)
* Insert: O(n)
...
What is the time complexity of list append and remove operations in Python | LabEx
This is because the list.append() operation simply adds a new element to the end of the list, and the underlying implementation of the Python list data structure is designed to handle this operation efficiently. Here's an example code snippet to demonstrate the constant time complexity of the list.append() operation:
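The LabEx snippet is cut off before its code; a sketch of the kind of demonstration it describes (my own reconstruction, not LabEx's actual snippet) could look like this - timing a batch of appends on a tiny list and a huge one:

```python
import timeit

times = {}
for size in (100, 1_000_000):
    lst = list(range(size))
    # Time 10,000 appends onto a list of this starting size.
    times[size] = timeit.timeit(lambda: lst.append(0), number=10_000)
    print(f"len={size:>9,}: {times[size]:.5f}s for 10,000 appends")
```

If append depended on list length, the second timing would be orders of magnitude larger; in practice both come out in the same ballpark.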
Time Complexity Analysis of Python Methods: Big(O) Notations for List, Tuple, Set, and Dictionary Methods - DEV Community
January 15, 2024 - For example, list.append(). The list reserves some memory in advance, so until it is used up, list.append() is O(1). However, when the reserved memory is filled and new memory is required, a new array is allocated with more space and all elements are copied over. While this operation is not always constant time, it happens infrequently, so we refer to it as amortized constant time. The "in" operator uses linear search with a time complexity of O(n).
Python Big O: the time complexities of different data structures in Python - Python Morsels
April 16, 2024 - For example, sets are faster at key lookups than lists, but they have no ordering. Dictionaries are just as fast at key lookups as sets and they maintain item insertion order, but they require more memory. In day-to-day Python usage, time complexity tends to matter most for avoiding loops within loops.
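The "loops within loops" point deserves a concrete sketch (the records here are invented): replacing a nested scan with a dict index turns an O(n*m) join into O(n + m).

```python
users = [(1, "ada"), (2, "bob"), (3, "cyd")]
orders = [(2, "book"), (1, "pen"), (2, "mug")]

# O(n*m): for every order, scan the whole user list.
joined_slow = [(name, item)
               for uid, item in orders
               for user_id, name in users
               if user_id == uid]

# O(n + m): index users once, then look each order up in O(1) on average.
by_id = {user_id: name for user_id, name in users}
joined_fast = [(by_id[uid], item) for uid, item in orders]

assert joined_slow == joined_fast == [("bob", "book"), ("ada", "pen"), ("bob", "mug")]
```

For three-element lists it makes no difference, but the dict version is what keeps real-sized data out of quadratic territory.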
Time Complexity of A List to Set Conversion in Python - GeeksforGeeks
July 23, 2025 - The time to convert a small list to a set: 0.0. The set is: {1, 2, 3, 4, 5}. The time to convert a large list to a set: 0.21737. The time complexity of list-to-set conversion is O(n), where n is the number of elements in the list.
Understanding Time Complexity in Python Functions - DEV Community
October 25, 2024 - ... Removing an element (by value) requires searching for the element first, which takes linear time. ... Python’s built-in sorting algorithm (Timsort) has a time complexity of O(n log n) in the average and worst cases.
Performance of Python Types
However, the expansion rate is cleverly chosen to be three times the previous size of the array; when we spread the expansion cost over each additional append afforded by this extra space, the cost per append is O(1) on an amortized basis. ... Popping from a Python list is typically performed from the end but, by passing an index, you can pop from a specific position.
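Whatever the exact growth factor (it is a CPython implementation detail and varies by version), the over-allocation itself is observable: sys.getsizeof only jumps occasionally, so most appends land in already-reserved space. A rough sketch:

```python
import sys

lst, jumps = [], 0
prev = sys.getsizeof(lst)
for i in range(1000):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != prev:           # the backing array was reallocated
        jumps += 1
        prev = size
print(f"1000 appends caused {jumps} size changes")
```

The number of size changes is far smaller than the number of appends, which is the whole point of amortized O(1).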
algorithm - Time Complexity of List creation in Python - Stack Overflow
For example, if you already have a list that you pass as an argument, the upper bound for creating a copy or appending a list at the end of it might be O(1). But, for the general case where you append n elements to a list, assuming you have a pointer at the end of the list and there is no shifting of values to the left or to the right, then that indeed would be O(n). However, if you want to insert n elements at an arbitrary position in the list, which would require shifting, that again depends on what implementation you are dealing with, but it will generally be O(n^2)…
“Exploring Python Lists: Understanding Methods, Operations, and Time Complexity” | by Ewho Ruth | Python in Plain English
December 5, 2024 - The time complexity for accessing an element in a list by index is O(1). Here are some common operations on lists in Python, along with real-world examples and their time complexities:
Time complexity of array/list operations [Java, Python] · YourBasic
Note: add(E element) takes constant amortized time, even though the worst-case time is linear. The following Python list operations operate on a subset of the elements, but still have time complexity that depends on n = len(a).
Why len(list) has a time-complexity of O(1)? — A dive into CPython | by Hamza Ehsan Khan | Medium
July 13, 2023 - This in-built function is not at the mercy of the size of the list. Whether your list contains 1 element or 1000, as per the default implementation of Python (CPython), the time-complexity is O(1).
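That claim is easy to sanity-check: time len() on lists of very different sizes (the sizes here are arbitrary) and the timings come out in the same ballpark, because len() just reads the size field stored in the list object.

```python
from timeit import timeit

short = [0]
big = [0] * 1_000_000

# len() reads a stored size field, so list length shouldn't matter.
t_short = timeit(lambda: len(short), number=100_000)
t_big = timeit(lambda: len(big), number=100_000)
print(f"len(short): {t_short:.5f}s, len(big): {t_big:.5f}s")
```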
Time complexity of list.count in Python | Sololearn: Learn to code for FREE!
What is the time complexity? ... Bagon: I've done some research in the CPython code; list.count has to iterate over the whole list, comparing each element against the value being counted. This means that at least in CPython, counting occurrences in a list is O(n).
Top answer (1 of 3)

It's amortized O(1), not strict O(1).

Let's say the list reserved size is 8 elements and it doubles in size when space runs out. You want to push 50 elements.

The first 8 elements push in O(1). The ninth triggers a reallocation and 8 copies, followed by an O(1) push. The next 7 push in O(1). The seventeenth triggers a reallocation and 16 copies, followed by an O(1) push. The next 15 push in O(1). The thirty-third triggers a reallocation and 32 copies, followed by an O(1) push. The remaining 17 push in O(1). Were we to keep pushing, the size of the list would double again at the 65th, 129th, 257th element, and so on.

So all 50 pushes have O(1) complexity, plus 3 reallocations at O(n) with n = 8, 16, and 32, for 56 copies in total. Note that the reallocation costs form a geometric series that sums to O(n), with n = the final size of the list. That means the whole operation of pushing n objects onto the list is O(n). If we amortize that per element, it's O(n)/n = O(1).
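That accounting can be checked with a small simulation of the doubling scheme described above (capacity starts at 8 and doubles when full):

```python
def count_copies(n_pushes, initial_capacity=8):
    """Count element copies caused by reallocations in a doubling scheme."""
    capacity, size, copies = initial_capacity, 0, 0
    for _ in range(n_pushes):
        if size == capacity:   # array is full: reallocate, copy everything
            copies += size
            capacity *= 2
        size += 1              # the push itself is O(1)
    return copies

assert count_copies(50) == 8 + 16 + 32   # 56 copies for 50 pushes
```

The copy count grows linearly with the number of pushes, which is exactly the amortized-O(1) claim.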

Answer 2 of 3

If you look at the footnote in the document you linked, you can see that they include a caveat:

These operations rely on the "Amortized" part of "Amortized Worst Case". Individual actions may take surprisingly long, depending on the history of the container.

Using amortized analysis, even if we occasionally have to perform expensive operations, we can get an upper bound on the 'average' cost of operations when we consider them as a sequence, instead of individually.

So, any individual operation could be very expensive - O(n) or O(n^2) or something even bigger - but since we know such operations are rare, we can guarantee that a sequence of n operations can be done in O(n) time.