sorting - Estimating runtime from time complexity of sort algorithms - Stack Overflow
What time complexity should one assume when using built in sort functions?
data structures - Sorting Algorithms & Time Complexity - Stack Overflow
Time complexity of sorting an array and then searching
Let's say I want to find a number in an unordered array in O(log n) time. Binary search itself runs in O(log n), but since I have to sort the array first, do we take that sort into account in the time complexity calculation?
The problem gets worse if you're using a built-in sort whose time complexity you don't know. For example, if you were doing this in JavaScript with [5,4,12,3,56,1, ...].sort(), what time complexity would you attribute to it?
I was asking myself this question a while ago, so I wrote some code to find out. The chart plots the number of inputs on the x-axis and time on the y-axis.
As you can see from the image, RadixSort is generally the fastest, followed by QuickSort. Their time complexities are:
- RadixSort: O(N*W), where N is the number of elements to sort and W is the number of bits required to store each key.
- QuickSort: O(N*logN), where N is the number of elements to sort.
RadixSort's speed comes at a cost, though. The space complexities of the two algorithms are the following:
- RadixSort: O(N+W), where N is the number of elements to sort and W is the number of bits required to store each key.
- QuickSort: O(log N) or O(N), depending on how the pivots are chosen: https://cs.stackexchange.com/questions/138335/what-is-the-space-complexity-of-quicksort.
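To make the O(N*W) bound concrete, here is a minimal LSD radix sort sketch for non-negative 32-bit integers. It does one counting-sort pass per byte (4 passes for 32 bits), each costing O(N + 256). The function name and the byte-per-pass choice are my own for illustration, not from the answer above:

```javascript
// Sketch: LSD radix sort for non-negative 32-bit integers,
// processing 8 bits per pass (so 32/8 = 4 passes total).
function radixSort(arr) {
  const BITS_PER_PASS = 8;
  const BUCKETS = 1 << BITS_PER_PASS;               // 256 buckets per pass
  let a = arr.slice();
  for (let shift = 0; shift < 32; shift += BITS_PER_PASS) {
    const count = new Array(BUCKETS).fill(0);
    for (const x of a) count[(x >>> shift) & (BUCKETS - 1)]++;
    // Prefix sums turn digit counts into ending positions (counting sort).
    for (let i = 1; i < BUCKETS; i++) count[i] += count[i - 1];
    const out = new Array(a.length);
    // Iterating backwards keeps the pass stable, which LSD radix requires.
    for (let i = a.length - 1; i >= 0; i--) {
      out[--count[(a[i] >>> shift) & (BUCKETS - 1)]] = a[i];
    }
    a = out;
  }
  return a;
}

console.log(radixSort([5, 4, 12, 3, 56, 1])); // [1, 3, 4, 5, 12, 56]
```

Note that each element is touched a constant number of times per pass, and the number of passes depends only on the key width W, never on N — which is where the O(N*W) bound comes from.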
Time complexities of common sorting algorithms:

| Algorithm | Best | Average | Worst |
| --- | --- | --- | --- |
| Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) |
| Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) |
| Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) |
| Heap Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) |
| Quick Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) |
| Merge Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) |
| Bucket Sort | Ω(n+k) | Θ(n+k) | O(n^2) |
| Radix Sort | Ω(nk) | Θ(nk) | O(nk) |
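The best-case/worst-case gap in the table can be seen directly in code. Here is a minimal insertion sort sketch (names are mine, for illustration): on already-sorted input the inner loop body never runs, giving the Ω(n) best case, while on reverse-sorted input every element shifts all the way left, giving the O(n^2) worst case.

```javascript
// Sketch: insertion sort. Best case Ω(n) on sorted input,
// worst case O(n^2) on reverse-sorted input.
function insertionSort(arr) {
  const a = arr.slice();
  for (let i = 1; i < a.length; i++) {
    const key = a[i];
    let j = i - 1;
    // Shift elements larger than `key` one slot to the right.
    while (j >= 0 && a[j] > key) {
      a[j + 1] = a[j];
      j--;
    }
    a[j + 1] = key;
  }
  return a;
}
```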
The time complexity of Quicksort is O(n log n) in the best case, O(n log n) in the average case, and O(n^2) in the worst case. But because it has the best average-case performance for most inputs — helped by small constant factors and cache-friendly in-place partitioning — Quicksort is generally considered the “fastest” general-purpose sorting algorithm.
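As a sketch of why the O(n^2) worst case is avoidable in practice, here is a minimal quicksort with a random pivot (illustrative, not any library's implementation): random pivots make consistently bad splits unlikely on any particular input. Note this out-of-place version uses O(n) extra space, unlike an in-place partition.

```javascript
// Sketch: quicksort with a random pivot. Three-way split so that
// duplicate keys don't trigger quadratic behavior.
function quickSort(arr) {
  if (arr.length <= 1) return arr.slice();
  const pivot = arr[Math.floor(Math.random() * arr.length)];
  const less = [], equal = [], greater = [];
  for (const x of arr) {
    if (x < pivot) less.push(x);
    else if (x > pivot) greater.push(x);
    else equal.push(x);
  }
  // Each level of recursion does O(n) work; with good pivots
  // there are O(log n) levels, hence O(n log n) on average.
  return [...quickSort(less), ...equal, ...quickSort(greater)];
}
```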
I have a function that takes an array, sorts it, and then iterates over every item.
I’m trying to understand why the time complexity is O(n log n) instead of O(n + log n) or O(n). Why are n and log n multiplied together when they are two separate steps?
My thought process is that if it takes 10 ms (log n) to sort the array and then 100 ms (n) to iterate over it, then the overall time is 110 ms (n + log n), not 1000 ms (n log n). And because 100 ms is much greater than 10 ms, I can simplify it to just 100 ms, or O(n).
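The step this reasoning misses is that comparison-based sorting itself takes O(n log n), not O(log n). The costs of sequential steps do add, so the total is O(n log n + n) — and since n log n dominates n, that simplifies to O(n log n). No multiplication of separate steps is involved; the product n·log n comes from the sort alone. A minimal sketch of the sort-then-iterate function described (names are mine, for illustration):

```javascript
// Sketch: sequential steps ADD their costs.
function sortAndScan(arr) {
  // Step 1: comparison sort — O(n log n), not O(log n).
  const sorted = arr.slice().sort((a, b) => a - b);
  // Step 2: linear scan over every item — O(n).
  let max = -Infinity;
  for (const x of sorted) {
    if (x > max) max = x;
  }
  // Total: O(n log n + n) = O(n log n), since n log n dominates.
  return { sorted, max };
}
```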
