algorithm - What is the time complexity of the merge step of mergesort? - Stack Overflow
It's O(n * log(n)), not O(log(n)). As you've accurately surmised, the entire input must be iterated through, and this must occur O(log(n)) times (the input can only be halved O(log(n)) times). n items iterated log(n) times gives O(n log(n)).
It's been proven that no comparison sort can operate faster than this. Only sorts that rely on a special property of the input, such as radix sort, can beat this complexity. Mergesort's constant factors are not especially small, though, so algorithms with worse asymptotic complexity can often take less time in practice.
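To make the counting concrete, here is a minimal Python sketch (not from the original answer): the array is halved O(log n) times, and each level of recursion merges a total of n elements in O(n).

```python
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # the input can only be halved O(log n) times
    right = merge_sort(a[mid:])
    # Merge step: a two-pointer walk over both halves, O(n) per level in total.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```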
The complexity of merge sort is O(n log(n)) and NOT O(log(n)).
Merge sort is a divide and conquer algorithm. Think of it in terms of 3 steps:
- The divide step computes the midpoint of each of the sub-arrays. This step takes only O(1) time per sub-array.
- The conquer step recursively sorts two subarrays of n/2 (for even n) elements each.
- The merge step merges n elements which takes O(n) time.
Now compare steps 1 and 3: between O(1) and O(n), O(n) dominates, so let's say steps 1 and 3 together take O(n) time. Call it cn for some constant c.
How many times are we subdividing the original array?
We subdivide the input until each sub-array has one item so there are exactly log(n) "subdivision stages". We undo each subdivision stage with a "merge stage".
For example, if n = 8, there is a total of 3 merge stages: one where pairs of length-n/8 sub-arrays are merged into length-n/4 sub-arrays, one where pairs of n/4s are merged into n/2s, and one where the two n/2 halves are merged into the full array of length n.
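As a quick sanity check (my own sketch, not part of the original answer), you can count the merge stages directly: each stage doubles the sub-array length until it reaches n.

```python
# Each merge stage doubles the sub-array length until it reaches n,
# so for n = 8 there are log2(8) = 3 merge stages.
n = 8
length, stages = 1, 0
while length < n:
    length *= 2   # length-1 pairs -> 2, then 2 -> 4, then 4 -> 8
    stages += 1
print(stages)  # 3
```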
What is the time cost for merging all pairs at each merge stage?
For this, look at the recursion tree below, level by level from top to bottom:
- Level 2 calls the merge method on 2 sub-arrays of length n/2 each. The cost here is 2 * (cn/2) = cn.
- Level 3 calls the merge method on 4 sub-arrays of length n/4 each. The cost here is 4 * (cn/4) = cn.
- and so on ...
At each merge stage, the total cost for merging all pairs is O(cn). Since there are log(n) merge stages, the total complexity is: (cost per stage)*(number of stages) = (cn)*(log(n)) or O(nlog(n)).

Image credits: Khan Academy
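The per-level accounting above can be checked empirically. The sketch below (hypothetical instrumentation, not from the answer) records how many elements the merge step touches at each recursion depth; every level sums to n.

```python
from collections import defaultdict

# Record how many elements are merged at each recursion depth.
work_per_level = defaultdict(int)

def merge_sort(a, depth=0):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], depth + 1)
    right = merge_sort(a[mid:], depth + 1)
    work_per_level[depth] += len(a)  # merging len(a) elements costs c * len(a)
    # Standard two-pointer merge.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

merge_sort(list(range(8, 0, -1)))
print(dict(work_per_level))  # every level merges 8 elements: cn per level
```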
The "split" step is the one that takes O(log n), and the merge one is O(n); I just realized that via a comment.
The split step of Merge Sort will take O(n) instead of O(log(n)). If we have the runtime function of split step:
T(n) = 2T(n/2) + O(1)
where T(n) is the runtime for input size n, 2 is the number of subproblems, n/2 is the size of each subproblem, and O(1) is the constant time to split an array in half.
We also have the base cases: T(4) = O(1) and T(3) = O(1).
Unrolling the recursion (informally), a length-n array is split about n times in total, each split taking O(1):
T(n) ≈ n * O(1) = O(n)
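You can count the split operations directly. This sketch (my own, for illustration) shows that a length-n array is split exactly n - 1 times in total, which is O(n), not O(log n).

```python
# Count how many split operations merge sort performs on a length-n array.
# A length-n array is split n - 1 times in total: O(n).
def count_splits(n):
    if n <= 1:
        return 0
    half = n // 2
    return 1 + count_splits(half) + count_splits(n - half)

print(count_splits(8))     # 7
print(count_splits(1024))  # 1023
```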
Moreover, to understand the time complexity of the merge step (the "finger algorithm"), we should look at the number of sub-arrays.
In the worst case, the number of sub-arrays grows as O(n/2 + 1) = O(n).
The finger algorithm grows linearly with the number of sub-arrays: it loops through each sub-array once, O(n), and in the worst case does a constant amount of extra work per element, so the time complexity of the merge step is O(2n) = O(n).
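A sketch of that merge ("finger") step, under the usual two-pointer formulation: two index fingers each walk through their sub-array exactly once, so merging two sub-arrays of total length n is O(n).

```python
# Two-pointer ("finger") merge of two already-sorted lists.
def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps the sort stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these two tails is non-empty
    out.extend(right[j:])
    return out

print(merge([1, 3, 5], [2, 2, 4, 6]))  # [1, 2, 2, 3, 4, 5, 6]
```

Each element is appended to `out` exactly once, which is where the O(n) bound comes from.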
