It's O(n * log(n)), not O(log(n)). As you've accurately surmised, the entire input must be iterated through, and this must occur O(log(n)) times (the input can only be halved O(log(n)) times). n items iterated log(n) times gives O(n log(n)).

It's been proven that no comparison sort can operate faster than this. Only sorts that rely on a special property of the input, such as radix sort, can beat this complexity. Mergesort's constant factors are typically not that great, though, so algorithms with worse asymptotic complexity can often take less time in practice.
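The halve-then-merge structure the answer describes can be sketched as follows (a minimal Python sketch of my own, not code from the answer):

```python
def merge_sort(a):
    """Sort a list: split in half, sort each half, merge. O(n log n)."""
    if len(a) <= 1:              # base case: 0 or 1 element is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])   # the input can only be halved O(log n) times
    right = merge_sort(a[mid:])
    # the merge iterates through every element once -> O(n) work per level
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Every level of recursion does O(n) merge work, and there are O(log n) levels, giving the O(n log n) total.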

Answer from DeadMG on Stack Exchange
VisuAlgo
visualgo.net › en › sorting
Sorting (Bubble, Selection, Insertion, Merge, Quick, Counting, Radix) - VisuAlgo
Merge Sort. When that happens, the depth of recursion is only O(log N). As each level takes O(N) comparisons, the time complexity is O(N log N).
Studocu
studocu.com › massachusetts college of art and design › computer science › time complexity analysis of merge sort algorithm in data science
Time Complexity Analysis of Merge Sort Algorithm in Data Science - Studocu
September 27, 2024 - This process involves comparing ... O(n). ... In Merge Sort, the best, average, and worst-case time complexities are all O(n log n)....
Discussions

algorithms - Why is mergesort O(log n)? - Software Engineering Stack Exchange
IMO it makes more sense to count ... At each "merge stage", a total of (cn) work is being performed (as Shantanu has explained). So the total cost across all stages is: (cn)*(log(n)). ... Merge Sort is a recursive algorithm and time complexity can be expressed as following recurrence ...
softwareengineering.stackexchange.com
Trying to understand the time complexity of merge sort issue
We merge two lists of length N/2, which takes time N. In order to do that, we had to merge 4 lists of length N/4, which takes time N. In order to do that, we had to merge 8 lists of length N/8, which takes time N. At the very beginning we merged N lists of length 1. So each time we're doubling the number of lists, until we get to N. The number of times you have to double 1 until you get to N is log_2(N). So that's where the log_2 comes from. Maybe that helps?
r/algorithms, August 25, 2020
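The doubling argument above can be checked directly: count how many times 1 must be doubled to reach N (a quick Python sketch of my own):

```python
def doublings_to_reach(n):
    """Count how many times 1 must be doubled to reach at least n.

    For powers of two this is exactly log2(n), the number of
    merge levels in merge sort.
    """
    count = 0
    size = 1
    while size < n:
        size *= 2
        count += 1
    return count
```

For example, `doublings_to_reach(8)` is 3, matching the three merge stages for an 8-element array.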
My Merge Sort gives me a stack overflow
You're missing parentheses. Try `let m = (l + r) / 2`. Without them, `l + r / 2` parses as `l + (r / 2)`, which is not the midpoint, so the sub-ranges never shrink down to the base case.
r/rust, October 25, 2020
Why is the worst case space complexity of Quicksort O(n)?
Quicksort is by its nature a recursive algorithm. The space complexity is not coming from needing an additional array (the array can get partitioned in-place), but from the call stack of each recursive function call. In the worst case, that is, when the array is already sorted in reverse order, the intuitive implementation of quicksort will call itself n times, thereby needing an O(n) large call stack. Note that there is a method that can achieve O(log n) space complexity even in the worst case, but the reason for the space complexity is the same: it comes from the call stack, and that method uses a clever trick to limit the number of recursive calls.
r/ECE, January 27, 2024
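The "clever trick" mentioned above is usually to recurse only into the smaller partition and loop on the larger one, which caps the stack depth at O(log n). A hedged Python sketch (Lomuto partition; all names are my own):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort whose stack depth is O(log n) even in the worst case."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        # Lomuto partition around the last element
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        # Recurse into the smaller side, loop on the larger side:
        # each recursive call handles at most half the current range,
        # so at most O(log n) frames are ever on the stack at once.
        if i - lo < hi - i:
            quicksort(a, lo, i - 1)
            lo = i + 1
        else:
            quicksort(a, i + 1, hi)
            hi = i - 1
    return a
```

Partition work is unchanged, so the time complexity is the same as plain quicksort; only the stack usage is bounded.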
People also ask

How does the merge sort algorithm work?
Merge sort is a divide-and-conquer algorithm that splits an array into halves, recursively sorts each half, and merges the sorted halves back together. It repeatedly divides arrays until subarrays of size one are achieved, then combines them in sorted order, resulting in a fully sorted array.
vaia.com
vaia.com › merge sort
Merge Sort: Algorithm & Time Complexity | Vaia
Is merge sort a stable sorting algorithm?
Yes, merge sort is a stable sorting algorithm because it preserves the relative order of equal elements in the input array. This is achieved by ensuring that when merging two halves of the array, elements from the left half are placed before equal elements from the right half.
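The tie-breaking rule described above can be shown in a standalone merge step: using `<=` rather than `<` is exactly what keeps equal keys in their original order (a small Python sketch of my own; the records are made up for illustration):

```python
def stable_merge(left, right, key=lambda r: r):
    """Merge two key-sorted lists; on equal keys, left elements come first."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # '<=' (not '<') makes the merge stable: an element from the
        # left half is emitted before an equal element from the right.
        if key(left[i]) <= key(right[j]):
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

# Both halves contain a record with key 2; ("left", 2) precedes ("right", 2).
merged = stable_merge([("x", 1), ("left", 2)], [("right", 2), ("y", 3)],
                      key=lambda r: r[1])
```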
What are the practical applications of merge sort?
Merge Sort is useful in scenarios where stability is crucial and when sorting linked lists due to its non-reliance on random access to data. It's often employed in external sorting algorithms, like sorting large datasets that don't fit in memory as it efficiently handles disk-based storage.
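The external-sorting idea (sort chunks that fit in memory, then k-way merge the sorted runs) can be sketched with the standard library's `heapq.merge`. This is an in-memory stand-in of my own; a real external sort would stream the runs from disk:

```python
import heapq

def external_sort(data, chunk_size):
    """Sort data 'too big for memory' by sorting fixed-size chunks
    (runs) and then k-way merging the sorted runs."""
    runs = [sorted(data[i:i + chunk_size])
            for i in range(0, len(data), chunk_size)]
    # heapq.merge lazily merges already-sorted inputs, holding only
    # one element per run at a time -- the external-sort access pattern.
    return list(heapq.merge(*runs))
```

Because the merge only ever needs the head of each run, memory use is proportional to the number of runs, not the total data size.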
Vaia
vaia.com › merge sort
Merge Sort: Algorithm & Time Complexity | Vaia
August 4, 2023 - Let's delve deeper into the computational aspects of Merge Sort: The time complexity of Merge Sort is consistently O(n log n), making it ideal for data-intensive tasks where efficiency matters.
Medium
medium.com › @truongnhukhang1993 › why-merge-sorts-time-complexity-is-o-nlogn-explain-by-mathematic-689adb8b519b
Why merge sort’s time complexity is O(NlogN)? Explain by mathematic | by Truong Nhu Khang | Medium
March 28, 2020 - Why merge sort’s time complexity is O(NlogN)? Explain by mathematic In this article, I will explain why merge sort time complexity is O(nLogn) by mathematic. Analyze the merge sort The idea of …
Hero Vired
herovired.com › learning-hub › blogs › space-complexity-of-merge-sort
Time and Space Complexity of Merge Sort - Hero Vired
June 27, 2024 - Even in this worst case, the time complexity remains O(n log n). The key difference lies in the number of comparisons and swaps required to merge the sub-arrays. Merge Sort is known for its efficient time complexity, but it also requires additional ...
Alma Better
almabetter.com › bytes › articles › merge-sort-time-complexity
What is the Time Complexity of Merge Sort Algorithm?
June 12, 2024 - This process is repeated recursively until the entire array is sorted. Merge Sort has an average and worst-case time complexity of O(n log n), making it a reliable choice for sorting large datasets.
Baeldung
baeldung.com › home › algorithms › sorting › when will the worst case of merge sort occur?
When Will the Worst Case of Merge Sort Occur? | Baeldung on Computer Science
March 18, 2024 - Step 2 of the algorithm includes “Merge + Sort”, where two subarrays are merged so that a sorted array is created from each pair of subarrays. In the last step, the two halves of the original array are merged so that the complete array is sorted: This algorithm loops through O(log n) times and the time complexity of every loop is O(n), so the time complexity of the entire function is O(n log n).
Quora
quora.com › What-is-the-time-complexity-of-Merge-Sort-and-why-does-it-have-this-complexity
What is the time complexity of Merge Sort and why does it have this complexity? - Quora
Answer (1 of 2): The split step of Merge Sort will take O(n) instead of O(log(n)). If we have the runtime function of the split step: T(n) = 2T(n/2) + O(1), where T(n) is the runtime for input size n, 2 is the number of new subproblems and n/2 is the ...
The complexity of merge sort is O(n log(n)) and NOT O(log(n)).

Merge sort is a divide and conquer algorithm. Think of it in terms of 3 steps:

  1. The divide step computes the midpoint of each of the sub-arrays. This step takes just O(1) time.
  2. The conquer step recursively sorts two subarrays of n/2 (for even n) elements each.
  3. The merge step merges n elements which takes O(n) time.

Now compare steps 1 and 3: between O(1) and O(n), O(n) dominates, so steps 1 and 3 together take O(n) time. Say it is cn for some constant c.

How many times are we subdividing the original array?

We subdivide the input until each sub-array has one item so there are exactly log(n) "subdivision stages". We undo each subdivision stage with a "merge stage".

For example, if n = 8, there are a total of 3 merge stages: one where pairs of the n size-1 sub-arrays are merged to form size-2 sub-arrays, one where pairs of size-2 sub-arrays are merged to form size-4 sub-arrays, and one where the two n/2 halves are merged to form the full array of n elements.

What is the time cost for merging all pairs at each merge stage?

For this, picture the recursion tree, level by level from top to bottom:

  • Level 2 calls the merge method on 2 sub-arrays of length n/2 each. The complexity here is 2 * (cn/2) = cn.
  • Level 3 calls the merge method on 4 sub-arrays of length n/4 each. The complexity here is 4 * (cn/4) = cn.
  • and so on ...

At each merge stage, the total cost for merging all pairs is O(cn). Since there are log(n) merge stages, the total complexity is: (cost per stage)*(number of stages) = (cn)*(log(n)) or O(nlog(n)).
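The "cn per stage, log(n) stages" accounting can be verified empirically by counting how many elements are touched at each recursion depth (an instrumented Python sketch of my own; the `work` counter is not part of the answer):

```python
from collections import defaultdict

work = defaultdict(int)  # recursion depth -> elements merged at that depth

def msort(a, depth=0):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = msort(a[:mid], depth + 1)
    right = msort(a[mid:], depth + 1)
    work[depth] += len(a)          # merging at this level touches len(a) elements
    return sorted(left + right)    # stand-in for the O(n) merge step

msort(list(range(8)))
# For n = 8: every level merges all 8 elements, across log2(8) = 3 levels.
```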

(Recursion-tree diagram not reproduced here. Image credits: Khan Academy)

OpenGenus
iq.opengenus.org › time-and-space-complexity-of-merge-sort-on-linked-list
Time and Space Complexity of Merge Sort on Linked List
July 4, 2022 - The general equation that we get ... substituting (3) through (5) into (7), we eliminate 'k'. Hence the worst case time complexity of merge sort is O(n*log n)....
Enjoy Algorithms
enjoyalgorithms.com › blog › merge-sort-algorithm
Merge Sort Algorithm
Overall time complexity = O(1) + O(n) + O(n) + O(n1) + O(n2) = O(n). If we observe closely, time complexity depends on the complexity of the merging loop where comparison, assignment, and increment are critical operations. There could be two different perspectives to understand this: Perspective ...
University of Maryland
math.umd.edu › ~immortal › CMSC351 › notes › mergesort.pdf
CMSC 351: MergeSort, Justin Wyss-Gallifent, October 1, 2024
(sub)list is divided in half. Any green element or group of elements are sorted. All the action above the center line is the recursive deconstruction while all the action below the center line is the re-merging of the sublists.
Youcademy
youcademy.org › merge-sort-time-space-complexity
Time and Space Complexity of Merge Sort
Merge Sort has a time complexity of O(n log n) in all cases: best, average, and worst. This makes it highly efficient compared to algorithms like Bubble Sort (O(n²)) for large datasets.
Programiz PRO
programiz.pro › resources › dsa-merge-sort-complexity
Exploring time and space complexity of Merge sort
December 19, 2024 - Merge Sort is a comparison-based divide-and-conquer sorting algorithm that works by recursively dividing the array into halves, sorting each half, and then merging them back together. It consistently performs with a time complexity of O(n log n) in the best, worst, and average cases.
HappyCoders.eu
happycoders.eu › algorithms › merge-sort
Merge Sort – Algorithm, Source Code, Time Complexity
June 12, 2025 - We have now executed the merge ... of previously O(n). The total complexity of the sorting algorithm is, therefore, O(n² log n) – instead of O(n log n)....
NVIDIA Developer
developer.nvidia.com › blog › merge-sort-explained-a-data-scientists-algorithm-guide
Merge Sort Explained: A Data Scientist’s Algorithm Guide | NVIDIA Technical Blog
June 12, 2023 - The time complexity of the merge sort algorithm remains O(n log n) for best, worst, and average scenarios, making it suitable for sorting large lists and linked lists where stability is important.
Medium
tarunjain07.medium.com › merge-sort-complexity-analysis-notes-b48426aa8d53
Merge sort Complexity analysis — [Notes] | by Tarun Jain | Medium
July 23, 2023 - Furthermore, in such systems, memory ... of merge sort’s space complexity is minor. One of the variants of merge sort is 3-way merge sort, which divides the array into three equal sections rather than two. Think! The crucial question is what the time and space complexity of a ...