data structures - Sorting Algorithms & Time Complexity - Stack Overflow
I have several questions about the time complexity of different sorting algorithms.
What time complexity should one assume when using built-in sort functions?
I was asking myself this question a while ago, and I decided to write some code to figure it out. The chart plots the number of inputs on the x-axis and time on the y-axis.
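The timing harness was along these lines (a minimal sketch; the function names and the choice of input sizes here are mine, not the exact code behind the chart):

```python
import random
import time

def benchmark(sort_fn, sizes, trials=3):
    """Time sort_fn on random integer lists of each size in `sizes`."""
    results = {}
    for n in sizes:
        best = float("inf")
        for _ in range(trials):
            data = [random.randrange(n) for _ in range(n)]
            start = time.perf_counter()
            sort_fn(data)
            # Keep the best of several trials to reduce timer noise
            best = min(best, time.perf_counter() - start)
        results[n] = best
    return results

# Example: time Python's built-in sorted() at a few input sizes
timings = benchmark(sorted, [1_000, 10_000, 100_000])
for n, t in timings.items():
    print(f"n={n:>7}: {t:.6f}s")
```

Plotting `results` for each algorithm over a range of sizes gives curves like the ones in the chart.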
As you can see from the image, RadixSort is generally the fastest, followed by QuickSort. Their time complexities are:
- RadixSort: O(N*W), where N is the number of elements to sort and W is the number of bits required to store each key.
- QuickSort: O(N*logN), where N is the number of elements to sort.
RadixSort's speed comes at a cost, however. The space complexities of the two algorithms are the following:
- RadixSort: O(N+W), where N is the number of elements to sort and W is the number of bits required to store each key.
- QuickSort: O(logN), or O(N) depending on how the pivots are chosen: https://cs.stackexchange.com/questions/138335/what-is-the-space-complexity-of-quicksort.
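The O(logN) figure for QuickSort assumes the recursion always descends into the smaller partition, as in this sketch (random-pivot Lomuto partitioning; my own illustrative code):

```python
import random

def quicksort(a, lo=0, hi=None):
    """In-place quicksort with random pivots.

    Recursing into the smaller partition and looping on the larger one
    bounds the stack depth at O(log n)."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        p = partition(a, lo, hi)
        if p - lo < hi - p:
            quicksort(a, lo, p - 1)   # smaller side: recurse
            lo = p + 1                # larger side: iterate
        else:
            quicksort(a, p + 1, hi)
            hi = p - 1

def partition(a, lo, hi):
    """Lomuto partition around a random pivot; returns its final index."""
    k = random.randrange(lo, hi + 1)
    a[k], a[hi] = a[hi], a[k]
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i
```

A naive version that recurses into both partitions can reach O(n) stack depth on unlucky pivots, which is the O(N) case the linked answer discusses.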
Time complexities of common sorting algorithms:

| Algorithm      | Best        | Average     | Worst       |
|----------------|-------------|-------------|-------------|
| Selection Sort | Ω(n^2)      | Θ(n^2)      | O(n^2)      |
| Bubble Sort    | Ω(n)        | Θ(n^2)      | O(n^2)      |
| Insertion Sort | Ω(n)        | Θ(n^2)      | O(n^2)      |
| Heap Sort      | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) |
| Quick Sort     | Ω(n log(n)) | Θ(n log(n)) | O(n^2)      |
| Merge Sort     | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) |
| Bucket Sort    | Ω(n+k)      | Θ(n+k)      | O(n^2)      |
| Radix Sort     | Ω(nk)       | Θ(nk)       | O(nk)       |
The time complexity of Quicksort is O(n log n) in the best and average cases and O(n^2) in the worst case, which occurs when the pivot choices repeatedly produce badly unbalanced partitions. Because it performs best on average for most inputs, Quicksort is generally considered the "fastest" comparison-based sorting algorithm.
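You can see the worst case directly by counting comparisons. This small experiment (my own illustrative code, using a fixed first-element pivot and an explicit stack so deep worst-case inputs don't hit Python's recursion limit) compares an already-sorted input against a shuffled one:

```python
import random

def quicksort_comparisons(data):
    """Sort a copy of `data` with first-element-pivot quicksort and
    return the number of element comparisons performed."""
    a = list(data)
    comparisons = 0
    stack = [(0, len(a) - 1)]   # explicit stack avoids deep recursion
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]            # fixed pivot: pathological on sorted input
        i = lo + 1
        for j in range(lo + 1, hi + 1):
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]
        stack.append((lo, i - 2))
        stack.append((i, hi))
    return comparisons

n = 2000
print("sorted input:  ", quicksort_comparisons(list(range(n))))      # ~n^2/2
print("shuffled input:", quicksort_comparisons(random.sample(range(n), n)))
```

On the sorted input every partition is maximally unbalanced, so the count is exactly n(n-1)/2; on the shuffled input it lands near n log n. Choosing pivots at random (or via median-of-three) makes the sorted-input worst case vanishingly unlikely.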
