It's O(n * log(n)), not O(log(n)). As you've accurately surmised, the entire input must be iterated through, and this must occur O(log(n)) times (the input can only be halved O(log(n)) times). n items iterated log(n) times gives O(n log(n)).
It's been proven that no comparison sort can operate faster than this. Only sorts that rely on a special property of the input, such as radix sort, can beat this complexity. The constant factors of mergesort are typically not that great, though, so algorithms with worse complexity can often take less time.
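The argument above can be seen directly in a minimal top-down mergesort sketch (function names are my own, not from the answer): the input is halved until single elements remain, which gives log(n) levels of recursion, and each level merges a total of n elements.

```python
def merge(left, right):
    # Merging two sorted lists of total length n takes O(n) comparisons.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

def merge_sort(items):
    # The input can only be halved O(log n) times before reaching
    # single-element lists, so the recursion is O(log n) deep.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```

n elements merged across log(n) levels is exactly the O(n log(n)) total the answer describes.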
Answer from DeadMG on Software Engineering Stack Exchange ("Why is mergesort O(log n)?")
The complexity of merge sort is O(n log(n)) and NOT O(log(n)).
Merge sort is a divide and conquer algorithm. Think of it in terms of 3 steps:
- The divide step computes the midpoint of each sub-array. This step takes only O(1) time.
- The conquer step recursively sorts two subarrays of n/2 (for even n) elements each.
- The merge step merges n elements which takes O(n) time.
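The three steps above give the recurrence T(n) = 2*T(n/2) + cn, which a tiny sketch can unroll by counting "element touches" (the function name and the choice c = 1 are my own for illustration):

```python
def merge_sort_cost(n):
    # Unrolls the recurrence T(n) = 2*T(n/2) + n:
    # - divide: O(1), ignored here
    # - conquer: recurse on two halves of size n/2
    # - merge: touches all n elements
    if n <= 1:
        return 0
    return 2 * merge_sort_cost(n // 2) + n

# For n a power of two the count is exactly n * log2(n),
# e.g. merge_sort_cost(8) == 24 == 8 * 3.
```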
Now, comparing steps 1 and 3, O(n) dominates O(1), so let's say steps 1 and 3 together take O(n) time, i.e. cn for some constant c.
How many times are we subdividing the original array?
We subdivide the input until each sub-array has one item so there are exactly log(n) "subdivision stages". We undo each subdivision stage with a "merge stage".
For example, if n = 8, there are 3 merge stages in total: one where pairs of length-n/8 sub-arrays are merged into length-n/4 sub-arrays, one where pairs of n/4s are merged into n/2s, and one where the pair of n/2s is merged into the full array of n elements.
What is the time cost for merging all pairs at each merge stage?
For this, look at the tree below - for each level from top to bottom:
- Level 2 calls the merge method on 2 sub-arrays of length n/2 each. The complexity here is 2 * (cn/2) = cn.
- Level 3 calls the merge method on 4 sub-arrays of length n/4 each. The complexity here is 4 * (cn/4) = cn.
- and so on ...
At each merge stage, the total cost of merging all pairs is cn. Since there are log(n) merge stages, the total complexity is (cost per stage) * (number of stages) = (cn) * (log(n)), or O(n log(n)).
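The per-stage accounting can be checked empirically with an instrumented mergesort that records how many elements each recursion depth merges (the instrumentation and names here are my own sketch, not from the answer):

```python
from collections import defaultdict

def merge_sort_levels(items, depth=0, work=None):
    # Instrumented mergesort: work[depth] accumulates how many
    # elements are merged at each recursion depth ("merge stage").
    if work is None:
        work = defaultdict(int)
    if len(items) <= 1:
        return items, work
    mid = len(items) // 2
    left, _ = merge_sort_levels(items[:mid], depth + 1, work)
    right, _ = merge_sort_levels(items[mid:], depth + 1, work)
    work[depth] += len(left) + len(right)  # the merge touches every element
    # Standard two-pointer merge of the sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, work

sorted_items, work = merge_sort_levels(list(range(8, 0, -1)))
# Every merge stage processes all n = 8 elements: work == {0: 8, 1: 8, 2: 8},
# i.e. log2(8) = 3 stages of cost n each.
```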

Image credits: Khan Academy