You are creating a new list object each time you concatenate: the + operator copies every element of the old list into the new one, plus the one extra element. So yes, each l = l + [i] is an O(N) operation, not O(1), which makes the loop as a whole quadratic.
At the very least, don't use + concatenation; use += augmented concatenation, which is equivalent to calling list.extend() and re-binding the name to the same list object:
def test3():
    l = []
    for i in range(1000):
        l += [i]  # or use l.extend([i])
    return l
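The timings below also reference test1 and test2, which are not shown in this excerpt; presumably they are the naive-concatenation and plain-append variants, something like this sketch (the names and bodies are assumptions inferred from the surrounding text):

```python
def test1():
    l = []
    for i in range(1000):
        l = l + [i]  # rebinds l to a brand-new list each iteration: O(i) copy
    return l

def test2():
    l = []
    for i in range(1000):
        l.append(i)  # appends in place: amortized O(1)
    return l
```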
This produces:
>>> print timeit.repeat(stmt=test1, number=100, repeat=2)
[0.1333179473876953, 0.12804388999938965]
>>> print timeit.repeat(stmt=test2, number=100, repeat=2)
[0.01052403450012207, 0.007989168167114258]
>>> print timeit.repeat(stmt=test3, number=100, repeat=2)
[0.013209104537963867, 0.011193037033081055]
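The quadratic cost of repeated concatenation becomes more dramatic as the list grows. A Python 3 sketch (not from the original answer) that compares the three strategies side by side:

```python
import timeit

def concat(n):
    l = []
    for i in range(n):
        l = l + [i]   # builds a brand-new list every iteration: O(i) copy
    return l

def append(n):
    l = []
    for i in range(n):
        l.append(i)   # amortized O(1) per element
    return l

def extend_inplace(n):
    l = []
    for i in range(n):
        l += [i]      # in-place extend, also amortized O(1) per element
    return l

# All three produce the same list; only their running time differs.
assert concat(1000) == append(1000) == extend_inplace(1000)

for f in (concat, append, extend_inplace):
    t = timeit.timeit(lambda: f(1000), number=20)
    print(f"{f.__name__:15s} {t:.4f}s")
```

Doubling n roughly doubles the time for the append and += versions but roughly quadruples it for the concatenation version.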
Answer from Martijn Pieters on Stack Overflow

Time complexity of min() and max() on a list of constant size? (Computer Science Stack Exchange)
What is Python's list.append() method WORST Time Complexity? It can't be O(1), right?
I know that lists in Python are implemented as arrays that store addresses of (pointers to) the actual objects. Therefore, after several appends, when the array is full, Python needs to reserve a new, larger block of memory and copy the entire array of addresses to the new location.
I've read on Stack Overflow that in Python the array roughly doubles in size when it runs out of space, so over n appends it only has to copy the whole array O(log n) times.
The bigger the list, the more copying each reallocation has to do. So how can the append operation have constant time complexity O(1) if it has some dependence on the array size?
I assume that since it copies addresses, not the objects themselves, each copy shouldn't take long; a pointer is only 8 bytes, after all. Moreover, reallocation happens rarely. Does that mean that O(1) is the average (amortized) time complexity? Is my assumption right?
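The rarity of those reallocations can be observed directly. A sketch illustrating CPython's over-allocation (an implementation detail of CPython, not a language guarantee — and the growth factor is closer to 1.125 than to 2): the list's allocated size jumps only occasionally, so the vast majority of appends copy nothing.

```python
import sys

l = []
sizes = []
for i in range(1000):
    l.append(i)
    sizes.append(sys.getsizeof(l))  # allocated size in bytes after each append

# 1000 appends trigger far fewer than 1000 reallocations.
distinct = len(set(sizes))
print(distinct)  # a few dozen distinct capacities, not 1000
```

Because the copying cost of the occasional reallocation is spread over the many cheap appends between reallocations, the amortized cost per append works out to O(1).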
That depends what exactly you mean by "constant sized". The time to find the minimum of a list with 917,340 elements is $O(1)$ with a very large constant factor. The time to find the minimum of various lists of different constant sizes is $O(n)$ and likely $\Theta(n)$ where $n$ is the size of each list. Finding the minimum of a list of 917,340 elements takes much longer than finding the minimum of a list of 3 elements.
I found this quote from the Wikipedia article on time complexity helpful:
The time complexity is generally expressed as a function of the size of the input.
So if the size of the input doesn't vary, for example if every list has exactly 256 integers, the running time also doesn't vary, and the time complexity is therefore O(1). This would be true of any algorithm: sorting, searching, and so on.
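Both halves of this point can be checked empirically: min() is $\Theta(n)$ in the list length, but for inputs of one fixed length that cost is just a constant. A sketch (the sizes chosen here are arbitrary assumptions):

```python
import random
import timeit

small = [random.random() for _ in range(3)]
big = [random.random() for _ in range(100_000)]

t_small = timeit.timeit(lambda: min(small), number=100)
t_big = timeit.timeit(lambda: min(big), number=100)

# Scanning 100,000 elements takes far longer than scanning 3 of them;
# but min() over many different 3-element lists would take the same
# (constant) time each call.
print(t_small, t_big)
```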