🌐
Reddit
reddit.com › r/learnprogramming › understanding merge sort big o complexity
r/learnprogramming on Reddit: Understanding merge sort Big O complexity
June 26, 2022 -

I'm going to be referring to this image. So the Big O of merge sort is nlogn.

So in the example I posted above, n = 8 so the Big O should be 8log(8) = 16. I think it's because in the first green level, we go through 8 items then merge and we do the same thing for the second green level so 8+8 = 16. But then I thought when we split the initial array (the purple steps) doesn't that add to the time complexity as well?

Top answer
1 of 3
Splitting the array doesn't meaningfully change the runtime. To be specific, "splitting the array" is actually the act of determining the midpoint and then making the two recursive calls. It takes only constant time to determine where the middle of the array is ((low + high) / 2). This doesn't change mergesort's runtime. A simple proof is that every "split" of the array is accompanied by a corresponding "merge" operation (since if you split the array into two pieces, you'll have to merge them back). So as an accounting trick, you can just book the cost of the split into the corresponding merge ==> this effectively makes splitting the array free, while making the merge operation slightly more expensive. (However, the merge operation is still O(n), which is why it doesn't affect anything.)
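A minimal Python sketch (my own illustration, not from the thread) makes the accounting above concrete: the "split" is a single index computation, while all of the O(n) work per level happens in the merge.

```python
def merge_sort(a, low=0, high=None):
    """Sort a[low:high] in place and return a."""
    if high is None:
        high = len(a)
    if high - low <= 1:
        return a
    mid = (low + high) // 2        # the entire cost of "splitting": O(1)
    merge_sort(a, low, mid)
    merge_sort(a, mid, high)
    # all the O(n) work is here: merge the two sorted halves
    merged = []
    i, j = low, mid
    while i < mid and j < high:
        if a[i] <= a[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(a[j]); j += 1
    merged.extend(a[i:mid])
    merged.extend(a[j:high])
    a[low:high] = merged
    return a
```

For example, `merge_sort([5, 2, 8, 1])` returns `[1, 2, 5, 8]`.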
2 of 3
Big O notation does not care about the specific constant factor. This allows you to ignore a lot of implementation details when describing an algorithm and still be able to determine its Big O complexity. As the other comment said, you can split an array by just changing the starting and ending indices, which takes constant time. But even if you literally copy the entire array into 2 arrays (doing it the inefficient way), the Big O complexity is still O(n log(n)). If you are still confused, here is a fully rigorous proof of the time complexity of merge sort. The recurrence relation for merge sort is T(2^k) <= 2T(2^(k-1)) + C·2^k. That means to sort 2^k items, you need to sort 2^(k-1) items twice, then add additional time for pre-processing and post-processing, which is at most a constant multiple of 2^k (that constant is C; we don't care about its specific value except that C >= 1, and by "constant" we mean it's independent of k). Set U(2^k) = 2U(2^(k-1)) + C·2^k with U(2^0) = 1. Then T(2^k) <= U(2^k), so U is an upper bound of T. And you can compute U exactly: U(2^k) = C·2^k + 2U(2^(k-1)) = C·2^k + 2C·2^(k-1) + 4U(2^(k-2)) = ... = C·2^k + 2C·2^(k-1) + ... + 2^(k-1)·C·2^1 + 2^k·U(2^0) = kC·2^k + 2^k, so T(n) <= kC·2^k + 2^k if n = 2^k. If n is not a power of 2, you can round up to the next power of 2 (so that 2^(k-1) < n <= 2^k).
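A quick numeric check (my own, not from the thread) that the closed form kC·2^k + 2^k really solves the recurrence above:

```python
def U(n, C=2):
    """The proof's recurrence U(2^k) = 2*U(2^(k-1)) + C*2^k, with U(1) = 1.

    n must be a power of 2; C is the arbitrary constant from the proof.
    """
    if n == 1:
        return 1
    return 2 * U(n // 2, C) + C * n

# closed form claimed in the proof: U(2^k) = k*C*2^k + 2^k
for k in range(11):
    n = 2 ** k
    assert U(n, C=2) == k * 2 * n + n
```

The loop checks the identity for k = 0..10; any C >= 1 would work the same way.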
Discussions

Trying to understand the time complexity of merge sort issue
We merge two lists of length N/2, which takes time N. In order to do that, we had to merge 4 lists of length N/4, which takes time N. In order to do that, we had to merge 8 lists of length N/8, which takes time N. At the very beginning we merged N lists of length 1. So each time we're doubling the number of lists, until we get to N. The number of times you have to double 1 until you get to N is log_2(N). So that's where the log_2 comes from. Maybe that helps? More on reddit.com
🌐 r/algorithms
6
3
August 25, 2020
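The doubling argument above can be checked directly (a sketch of my own, not from the thread): count how many doublings take 1 up to N, and compare with log_2(N).

```python
import math

def levels_to_reach(n):
    """Count how many doublings take 1 up to n (n a power of 2).

    Each merge pass doubles the length of the sorted runs, i.e. halves
    the number of lists, so this is the number of O(N) merge passes.
    """
    lists, levels = 1, 0
    while lists < n:
        lists *= 2
        levels += 1
    return levels

# 1 -> 2 -> 4 -> 8 takes 3 doublings, i.e. log_2(8) passes of O(N) merging
assert levels_to_reach(8) == 3 == int(math.log2(8))
```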
My Merge Sort gives me a stack overflow
You're missing parentheses. Try `let m = (l + r) / 2`. More on reddit.com
🌐 r/rust
12
4
October 25, 2020
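The same precedence pitfall exists in Python (my own illustration; the thread's fix is for Rust): without parentheses, integer division binds tighter than addition, so the "midpoint" can land outside the range and the recursion never shrinks.

```python
def midpoints(l, r):
    """Return the buggy and the correct midpoint of [l, r]."""
    wrong = l + r // 2     # parses as l + (r // 2)
    right = (l + r) // 2   # the intended midpoint
    return wrong, right

# For l=4, r=6 the buggy version yields 7, outside [4, 6] entirely, so a
# recursive sort would never shrink its subarray -> stack overflow.
# For l=0 the bug is invisible, which is why it often escapes early testing.
assert midpoints(4, 6) == (7, 5)
assert midpoints(0, 10) == (5, 5)
```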
Merge sort vs Quicksort Applications
Hey Ronav, Great post, I agree with a lot of your thoughts. I wanted to add on to this post and discuss insertion sort briefly -- even though it's not ideal (it has an average time complexity of O(n^2) and a best-case time complexity of O(n)), insertion sort is stable, preserving the relative order of equal elements. Also, if you have an array that is mostly sorted, insertion sort is quite efficient, and we don't need to wait for all the data to come in before sorting it. However, on average, when it comes to larger or more complex datasets, quicksort and merge sort are certainly better. More on reddit.com
🌐 r/cs2c
3
5
April 3, 2024
🌐
Codecademy
codecademy.com › article › time-complexity-of-merge-sort
Time Complexity of Merge Sort: A Detailed Analysis | Codecademy
Explore the time complexity of Merge Sort in-depth, including best, average, and worst-case analysis, and comparison with other sorting algorithms.
Worst-case-optimal, stable, divide-and-conquer comparison sorting algorithm
In computer science, merge sort (also commonly spelled as mergesort or merge-sort) is an efficient and general purpose comparison-based sorting algorithm. Most implementations of merge sort are stable, which means that the … Wikipedia
Factsheet
Data structure: Array
Worst-case performance: O(n log n)
🌐
Wikipedia
en.wikipedia.org › wiki › Merge_sort
Merge sort - Wikipedia
2 weeks ago - In the worst case, merge sort uses approximately 39% fewer comparisons than quicksort does in its average case, and in terms of moves, merge sort's worst case complexity is O(n log n) - the same complexity as quicksort's best case.
🌐
DigitalOcean
digitalocean.com › community › tutorials › merge-sort-algorithm-java-c-python
Merge Sort Algorithm - Java, C, and Python Implementation | DigitalOcean
August 3, 2022 - The list of size N is divided into at most log N levels, and merging all sublists at each level into a single list takes O(N) time, so the worst-case run time of this algorithm is O(N log N). Best Case Time Complexity: O(n log n) · Worst Case Time Complexity: O(n log n) · Average Time Complexity: O(n log n) ...
Find elsewhere
🌐
Alma Better
almabetter.com › bytes › articles › merge-sort-time-complexity
What is the Time Complexity of Merge Sort Algorithm?
June 12, 2024 - This process is repeated recursively until the entire array is sorted. Merge Sort has an average and worst-case time complexity of O(n log n), making it a reliable choice for sorting large datasets.
🌐
Youcademy
youcademy.org › merge-sort-time-space-complexity
Time and Space Complexity of Merge Sort
Merge Sort has a time complexity of O(n log n) in all cases: best, average, and worst. This makes it highly efficient compared to algorithms like Bubble Sort (O(n²)) for large datasets.
🌐
Quora
quora.com › How-is-the-time-complexity-of-merge-function-in-merge-sort-is-O-n
How is the time complexity of merge function in merge sort is [math]O(n)[/math]? - Quora
Answer (1 of 4): For every element that the MERGE function algorithm is outputting in the output array / list from the two sorted arrays / lists at most 1 comparison is made and the output list pointer never backtracks. (Of course either of the 2 sorted list may get exhausted and in that case a s...
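The Quora answer's point can be shown with an instrumented merge (a sketch of my own, not from the answer): each output element costs at most one comparison, and none are needed once a list is exhausted, so merging m + n elements takes at most m + n - 1 comparisons.

```python
def merge_count(xs, ys):
    """Merge two sorted lists, returning (merged, comparison count)."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(xs) and j < len(ys):
        comparisons += 1           # exactly one comparison per element output...
        if xs[i] <= ys[j]:
            out.append(xs[i]); i += 1
        else:
            out.append(ys[j]); j += 1
    out.extend(xs[i:])             # ...and none once one list is exhausted
    out.extend(ys[j:])
    return out, comparisons

merged, c = merge_count([1, 3, 5], [2, 4, 6])
# merged == [1, 2, 3, 4, 5, 6], and c == 5 <= 3 + 3 - 1
```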
🌐
Sarthaks eConnect
sarthaks.com › 3587537 › what-is-the-time-complexity-of-merge-sort
What is the time complexity of Merge Sort? - Sarthaks eConnect | Largest Online Education Community
January 22, 2024 - The time complexity of Merge Sort is O(n log n), where "n" is the number of elements in the array. Merge Sort is an efficient, comparison-based sorting algorithm that follows the divide-and-conquer paradigm
🌐
Medium
medium.com › enjoy-algorithm › merge-sort-algorithm-design-and-analysis-f3ad2c2c8a9e
Merge Sort Algorithm | EnjoyAlgorithms
November 18, 2022 - Merge sort is an efficient sorting algorithm that works in O(nlogn) time complexity (both best and worst cases).
🌐
NVIDIA Developer
developer.nvidia.com › blog › merge-sort-explained-a-data-scientists-algorithm-guide
Merge Sort Explained: A Data Scientist’s Algorithm Guide | NVIDIA Technical Blog
June 12, 2023 - The algorithm works by dividing ... to form the final sorted list. The time complexity of the merge sort algorithm remains O(n log n) for best, worst, and average scenarios, ......
🌐
Scribd
scribd.com › document › 717742976 › 9629-aoaexp4
Merge Sort Time Complexity Analysis | PDF
🌐
Medium
tarunjain07.medium.com › merge-sort-complexity-analysis-notes-b48426aa8d53
Merge sort Complexity analysis — [Notes] | by Tarun Jain | Medium
July 23, 2023 - best choice for sorting linked lists due to its efficiency and low memory consumption → O(nlogn) time and O(1) extra space. → Other algorithms like quicksort or heapsort don't perform well with linked lists due to slow random access of linked ...
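A sketch of merge sort on a singly linked list (my own illustration, not from the article): nodes are relinked in place, so no O(n) auxiliary array is needed. Note this recursive version still uses O(log n) call stack; the O(1)-extra-space claim applies to the iterative bottom-up variant.

```python
class Node:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def merge_sort_list(head):
    """Sort a singly linked list by splitting and relinking nodes."""
    if head is None or head.next is None:
        return head
    # find the middle with slow/fast pointers, then cut the list in two
    slow, fast = head, head.next
    while fast and fast.next:
        slow, fast = slow.next, fast.next.next
    mid, slow.next = slow.next, None
    left, right = merge_sort_list(head), merge_sort_list(mid)
    # merge by relinking existing nodes -- no new nodes allocated
    dummy = tail = Node(None)
    while left and right:
        if left.val <= right.val:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next

def from_iter(xs):
    """Build a linked list from a Python iterable (helper for testing)."""
    head = None
    for v in reversed(list(xs)):
        head = Node(v, head)
    return head

def to_pylist(head):
    """Collect linked-list values into a Python list (helper for testing)."""
    out = []
    while head:
        out.append(head.val); head = head.next
    return out
```

For example, `to_pylist(merge_sort_list(from_iter([4, 1, 3, 2])))` gives `[1, 2, 3, 4]`.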
🌐
WsCube Tech
wscubetech.com › resources › dsa › merge-sort
Merge Sort: Algorithm, Complexity, Examples (C, Python, More)
February 14, 2026 - Learn about Merge Sort, its Algorithm, Example, Complexity in this tutorial. Understand how this efficient sorting technique works in various languages.
🌐
Medium
medium.com › outco › breaking-down-mergesort-924c3a55c969
Breaking Down MergeSort. And Understanding O(N log N) Time… | by Sergey Piterman | Outco | Medium
November 30, 2020 - Personally, I’ve come across ... I like about MergeSort is that it’s efficient. Its worst-case time complexity is O(N log N), which is as efficient as you can get for general-purpose sorting algorithms....
🌐
Baeldung
baeldung.com › home › algorithms › sorting › when will the worst case of merge sort occur?
When Will the Worst Case of Merge Sort Occur? | Baeldung on Computer Science
March 18, 2024 - Step 2 of the algorithm includes “Merge + Sort”, where two subarrays are merged so that a sorted array is created from each pair of subarrays. In the last step, the two halves of the original array are merged so that the complete array is sorted: This algorithm loops through O(log n) levels and the time complexity of every level is O(n), so the time complexity of the entire function is O(n log n).
🌐
Vlabs
ds1-iiith.vlabs.ac.in › exp › merge-sort › analysis › time-and-space-complexity.html
Merge Sort
Total running time of merge sort is O(N log₂ N). While merging two arrays, we require auxiliary space to temporarily store the merged array, before we plug this partially sorted array into the main array. Hence the space complexity of Merge Sort is O(N), as we require an auxiliary array as big ...