Worst-case optimal, stable, divide-and-conquer comparison sorting algorithm
[Animated example of merge sort]
In computer science, merge sort (also commonly spelled as mergesort or merge-sort) is an efficient and general-purpose comparison-based sorting algorithm. Most implementations of merge sort are stable, which means that the … Wikipedia
Factsheet
Data structure Array
Worst-case performance O(n log n)
🌐
Wikipedia
en.wikipedia.org › wiki › Merge_sort
Merge sort - Wikipedia
1 week ago - In the worst case, merge sort uses approximately 39% fewer comparisons than quicksort does in its average case, and in terms of moves, merge sort's worst case complexity is O(n log n) - the same complexity as quicksort's best case.
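The algorithm these results describe can be sketched in a few lines of Python. This is an illustrative top-down version under the usual definition (split in half, sort each half, merge), not the implementation from any of the sources above; the function names are made up for the example.

```python
def merge(left, right):
    # Merge two already-sorted lists into one sorted list.
    # Stability: on ties, take from `left` first (hence <=).
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    # Divide until sublists have length <= 1, then merge back up.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

Each level of recursion merges a total of n elements, and there are about log2(n) levels, which is where the O(n log n) bound in the snippets comes from.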
Discussions

algorithm - What is the time complexity of the merge step of mergesort? - Stack Overflow
I know this algorithm has a time complexity of O(n log n), but if we speak about only the merge step, is this one still O(n log n)? Or is it reduced to O(log n)? I believe the second is the answer but s... More on stackoverflow.com
🌐 stackoverflow.com
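To see why a single merge is linear rather than logarithmic, it helps to count the comparisons directly: merging two sorted runs of total length n performs at most n - 1 comparisons and exactly n appends. A sketch (the helper name is invented for this example):

```python
def merge_count(left, right):
    # Merge two sorted lists, counting element comparisons.
    out, comps = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out, comps

merged, comps = merge_count([1, 3, 5, 7], [2, 4, 6, 8])
# comps == 7 here: at most len(left) + len(right) - 1 comparisons, i.e. linear
```

So one merge is O(n); it is only across all log n levels of merges that the total becomes O(n log n).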
Trying to understand the time complexity of merge sort issue
We merge two lists of length N/2, which takes time N. In order to do that, we had to merge 4 lists of length N/4, which takes time N. In order to do that, we had to merge 8 lists of length N/8, which takes time N. At the very beginning we merged N lists of length 1. So each time we're doubling the number of lists, until we get to N. The number of times you have to double 1 until you get to N is log_2 (N). So that's where the log_2 comes from. Maybe that helps? More on reddit.com
🌐 r/algorithms
6
3
August 25, 2020
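The doubling argument in that answer is easy to check numerically: starting from n single-element lists, each merge pass halves the number of lists, so the number of passes is about log2(n), and each pass touches roughly n elements. A quick sketch (the function name is made up):

```python
import math

def merge_passes(n):
    # Count the passes needed to go from n one-element lists to a single list.
    passes, lists = 0, n
    while lists > 1:
        lists = math.ceil(lists / 2)  # one pass merges lists pairwise
        passes += 1
    return passes

# Each pass moves about n elements, so total work is roughly n * merge_passes(n).
```

For n a power of two, `merge_passes(n)` is exactly log2(n), matching the answer above.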
My Merge Sort gives me a stack overflow
You're missing parentheses. Try `let m = (l + r) / 2`. More on reddit.com
🌐 r/rust
12
4
October 25, 2020
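The bug in that thread isn't Rust-specific: with C-style operator precedence, `l + r / 2` parses as `l + (r / 2)`, so for a subrange that doesn't start at 0 the computed "midpoint" can land outside the range, the recursion never shrinks, and the stack overflows. The same trap reproduced in Python (values chosen just for illustration):

```python
l, r = 6, 8                  # say we are sorting the subrange [6, 8]

bad_mid = l + r // 2         # parses as l + (r // 2) == 10, outside [6, 8]
good_mid = (l + r) // 2      # == 7, the actual midpoint

# Recursing on [l, bad_mid] gives a range no smaller than [l, r],
# so the recursion never bottoms out.
```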
🌐
Reddit
reddit.com › r/learnprogramming › understanding merge sort big o complexity
r/learnprogramming on Reddit: Understanding merge sort Big O complexity
June 26, 2022 -

I'm going to be referring to this image. So the Big O of merge sort is n log n.

So in the example I posted above, n = 8, so the Big O should be 8 log(8) = 16. I think it's because in the first green level, we go through 8 items then merge, and we do the same thing for the second green level, so 8 + 8 = 16. But then I thought, when we split the initial array (the purple steps), doesn't that add to the time complexity as well?

Top answer (1 of 3)
Splitting the array doesn't meaningfully change the runtime. To be specific, "splitting the array" is actually the act of determining the midpoint and then making the two recursive calls. It takes only constant time to determine where the middle of the array is ((low + high) / 2). This doesn't change mergesort's runtime. A simple proof is that every "split" of the array is accompanied by a corresponding "merge" operation (since if you split the array into two pieces, you'll have to merge it back). So as an accounting trick, you can just book the cost of the split into the corresponding merge: this effectively makes splitting the array free, while making the merge operation slightly more expensive. (However, the merge operation will still be O(n), which is why it doesn't affect anything.)
Answer 2 of 3
Big O notation does not care about the specific constant factor. This allows you to ignore a lot of implementation details when describing an algorithm and still be able to determine its Big O complexity. As the other comment said, you can split an array by just changing the starting and ending point, which takes constant time. But even if you literally copy the entire array into 2 arrays (doing it the inefficient way), the Big O complexity is still O(n log n). If you are still confused, this is a fully rigorous proof of the time complexity of merge sort: The recurrence relation for merge sort is T(2^k) ≤ 2T(2^(k-1)) + C·2^k. That means to sort 2^k items, you need to sort 2^(k-1) items twice, then add in additional time for pre-processing and post-processing, which is at most a constant multiple of 2^k (that constant is C, and we don't care about its specific value except that it's ≥ 1, and by "constant" we mean it's independent of k). Set U(2^k) = 2U(2^(k-1)) + C·2^k and U(2^0) = 1. Then T(2^k) ≤ U(2^k), so U is an upper bound of T. And you can compute U exactly: U(2^k) = C·2^k + 2U(2^(k-1)) = C·2^k + 2C·2^(k-1) + 2^2·U(2^(k-2)) = ... = C·2^k + 2^1·C·2^(k-1) + ... + 2^(k-1)·C·2^1 + 2^k·U(2^0) = kC·2^k + 2^k, so T(n) ≤ kC·2^k + 2^k if n = 2^k. If n is not a power of 2, you can round up to the next power of 2 (so that 2^(k-1)
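The closed form in that proof can be sanity-checked by evaluating the recurrence directly. A small sketch, assuming the same definitions as the answer: U(2^k) = 2·U(2^(k-1)) + C·2^k with U(2^0) = 1.

```python
def U(k, C=1):
    # U(2^k) = 2*U(2^(k-1)) + C*2^k, with base case U(2^0) = 1.
    if k == 0:
        return 1
    return 2 * U(k - 1, C) + C * 2**k

# The closed form derived in the answer: U(2^k) = k*C*2^k + 2^k.
for k in range(12):
    assert U(k, C=3) == k * 3 * 2**k + 2**k
```

With n = 2^k, the closed form reads U(n) = C·n·log2(n) + n, i.e. O(n log n) once constants and lower-order terms are dropped.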
🌐
Codecademy
codecademy.com › article › time-complexity-of-merge-sort
Time Complexity of Merge Sort: A Detailed Analysis | Codecademy
Explore the time complexity of Merge Sort in-depth, including best, average, and worst-case analysis, and comparison with other sorting algorithms.
🌐
W3Schools
w3schools.com › dsa › dsa_timecomplexity_mergesort.php
DSA Merge Sort Time Complexity
The number of splitting operations \((n-1)\) can be removed from the Big O calculation above because \( n \cdot \log_{2}n\) will dominate for large \( n\), and because of how we calculate time complexity for algorithms. The figure below shows how the time increases when running Merge Sort on an array with \(n\) values.
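The \(n-1\) splitting operations mentioned here can be verified with a short recursion: every split turns one piece into two, so producing \(n\) single-element pieces always takes exactly \(n-1\) splits, no matter where each array divides. A sketch (function name invented for the example):

```python
def count_splits(n):
    # Splits needed to reduce an n-element array to single elements.
    if n <= 1:
        return 0
    mid = n // 2
    return 1 + count_splits(mid) + count_splits(n - mid)

# count_splits(n) == n - 1 for every n >= 1
```

Since n - 1 is linear, it is dominated by the n·log2(n) merge work, which is why the splits drop out of the Big O calculation.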
🌐
GeeksforGeeks
geeksforgeeks.org › dsa › merge-sort
Merge Sort - GeeksforGeeks
2T(n/2) represents the time taken by the algorithm to recursively sort the two halves of the array. Since each half has n/2 elements, we have two recursive calls with input size (n/2). O(n) represents the time taken to merge the two sorted halves.
Published October 3, 2025
🌐
Medium
medium.com › enjoy-algorithm › merge-sort-algorithm-design-and-analysis-f3ad2c2c8a9e
Merge Sort Algorithm | EnjoyAlgorithms
November 18, 2022 - Merge sort is an efficient sorting algorithm that works in O(nlogn) time complexity (both best and worst cases).
🌐
NVIDIA Developer
developer.nvidia.com › blog › merge-sort-explained-a-data-scientists-algorithm-guide
Merge Sort Explained: A Data Scientist’s Algorithm Guide | NVIDIA Technical Blog
June 12, 2023 - The merge sort notation for its average, best, and worst time complexity is log n * n * O(1). In Big O notation, low-order terms and constants are negligible, meaning the final notation for the merge sort algorithm is O(n log n).
🌐
Alma Better
almabetter.com › bytes › articles › merge-sort-time-complexity
What is the Time Complexity of Merge Sort Algorithm?
June 12, 2024 - This process is repeated recursively until the entire array is sorted. Merge Sort has an average and worst-case time complexity of O(n log n), making it a reliable choice for sorting large datasets.
🌐
Youcademy
youcademy.org › merge-sort-time-space-complexity
Time and Space Complexity of Merge Sort
Merge Sort has a time complexity of O(n log n) in all cases: best, average, and worst. This makes it highly efficient compared to algorithms like Bubble Sort (O(n²)) for large datasets.
🌐
W3Schools
w3schools.com › dsa › dsa_algo_mergesort.php
DSA Merge Sort
And the time complexity is pretty much the same for different kinds of arrays. The algorithm needs to split the array and merge it back together whether it is already sorted or completely shuffled.
🌐
Enjoy Algorithms
enjoyalgorithms.com › blog › merge-sort-algorithm
Merge Sort Algorithm
Merge sort is one of the fastest comparison-based sorting algorithms, which works on the idea of a divide and conquer approach. The worst and best-case time complexity of the merge sort is O(nlogn), and the space complexity is O(n). It is also one of the best algorithms for sorting linked lists ...
🌐
Medium
medium.com › outco › breaking-down-mergesort-924c3a55c969
Breaking Down MergeSort. And Understanding O(N log N) Time… | by Sergey Piterman | Outco | Medium
November 30, 2020 - Personally, I’ve come across ... I like about MergeSort is that it’s efficient. Its worst-case time complexity is O(N log N), which is as efficient as you can get for general-purpose sorting algorithms....
🌐
Vlabs
ds1-iiith.vlabs.ac.in › exp › merge-sort › analysis › time-and-space-complexity.html
Merge Sort
Total running time of merge sort is O(N log₂ N). While merging two arrays, we require auxiliary space to temporarily store the merged array, before we plug this partially sorted array into the main array. Hence the space complexity of Merge Sort is O(N), as we require an auxiliary array as big ...
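The O(N) auxiliary array described here can be allocated once and reused by every merge, since no two merges at the same recursion level overlap. A sketch of that pattern, assuming a top-down sort that writes each merge through a single scratch buffer (names are illustrative):

```python
def merge_sort_buffered(a):
    # Top-down merge sort using one auxiliary buffer of size len(a):
    # O(n) extra space, as described above.
    aux = [None] * len(a)

    def sort(lo, hi):               # sorts a[lo:hi] in place
        if hi - lo <= 1:
            return
        mid = (lo + hi) // 2
        sort(lo, mid)
        sort(mid, hi)
        # Merge a[lo:mid] and a[mid:hi] into aux, then copy back.
        i, j = lo, mid
        for k in range(lo, hi):
            if j >= hi or (i < mid and a[i] <= a[j]):
                aux[k] = a[i]; i += 1
            else:
                aux[k] = a[j]; j += 1
        a[lo:hi] = aux[lo:hi]

    sort(0, len(a))
    return a
```

Allocating the buffer once, instead of fresh lists in every recursive call, keeps the extra space at O(N) rather than churning allocations.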
🌐
DigitalOcean
digitalocean.com › community › tutorials › merge-sort-algorithm-java-c-python
Merge Sort Algorithm - Java, C, and Python Implementation | DigitalOcean
August 3, 2022 - The list of size N is divided into a maximum of log N parts, and the merging of all sublists into a single list takes O(N) time, so the worst-case run time of this algorithm is O(N log N). Best Case Time Complexity: O(n log n) Worst Case Time Complexity: O(n log n) Average Time Complexity: O(n log n) ...
🌐
Programiz PRO
programiz.pro › resources › dsa-merge-sort-complexity
Exploring time and space complexity of Merge sort
December 19, 2024 - Merge Sort is a comparison-based divide-and-conquer sorting algorithm that works by recursively dividing the array into halves, sorting each half, and then merging them back together. It consistently performs with a time complexity of O(n log n) in the best, worst, and average cases.
🌐
Baeldung
baeldung.com › home › algorithms › sorting › when will the worst case of merge sort occur?
When Will the Worst Case of Merge Sort Occur? | Baeldung on Computer Science
March 18, 2024 - Step 2 of the algorithm includes “Merge + Sort”, where two subarrays are merged so that a sorted array is created from each pair of subarrays. In the last step, the two halves of the original array are merged so that the complete array is sorted: This algorithm loops through \(\log_2 n\) times and the time complexity of every loop is \(O(n)\), so the time complexity of the entire function is \(O(n \log n)\).
🌐
Studytonight
studytonight.com › data-structures › merge-sort
Merge Sort Time Complexity
Time complexity of Merge Sort is O(n log n) in all 3 cases (worst, average, and best), as merge sort always divides the array into two halves and takes linear time to merge the two halves.