GeeksforGeeks
Time Complexities of all Sorting Algorithms - GeeksforGeeks
September 23, 2016 - Time complexity is defined as the order of growth of the time taken in terms of input size, rather than the total time taken, because the total time also depends on external factors such as the compiler used and the processor's speed.
Discussions

Is there a sorting algorithm with linear time complexity and O(1) auxiliary space complexity? - Stack Overflow
Is there a sorting algorithm with linear time complexity and O(1) auxiliary space complexity to sort a list of positive integers? I know that radix sort and counting sort have linear time complexit...
stackoverflow.com
Making a fast and stable sorting algorithm with O(1) memory
As has been pointed out elsewhere, it's actually O(log n) memory (it stores an index into the array, which requires at least log n bits).
r/compsci - March 17, 2014
Big O Cheat Sheet: the time complexities of operations on Python's data structures
Good for people getting into programming in general. I only have one remark: I wouldn't qualify O(n) as "slow", since it's still practically fast for low values of n and has the elegance of scaling linearly, which is one of the best scenarios available in the vast majority of cases a programmer will face.
r/Python - April 16, 2024
What is the point of learning time complexity?
Let's say you write a program that does something, and you test it, and it works pretty well: it spits out the right answer pretty quickly. Would a company or university be able to use that same program's algorithm to process thousands of entries at a time? The program that took less than a second for you may take either a couple of seconds or several days, depending on the time complexity of the algorithm. Let's say you came up with an O(n) algorithm. If it took about half a second to run, it'll take roughly eight minutes (500 seconds) to run on 1,000 times as much data. If, unfortunately, your algorithm is O(n^2), it will take about a week (500,000 seconds) on 1,000 times as much data. That's why it's important to see if you can reduce the time complexity.
r/learnprogramming - October 1, 2022
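The scaling argument in this answer is simple arithmetic and can be checked directly; a minimal sketch using the answer's own numbers (0.5-second baseline, 1,000x more data, constant factors assumed unchanged):

```python
# Estimate how a 0.5 s baseline run scales when the input grows 1000x,
# under O(n) and O(n^2) growth.
baseline_seconds = 0.5
scale = 1000

linear = baseline_seconds * scale          # O(n): time grows by the same factor
quadratic = baseline_seconds * scale ** 2  # O(n^2): time grows by the factor squared

print(f"O(n):   {linear / 60:.1f} minutes")     # ~8.3 minutes
print(f"O(n^2): {quadratic / 86400:.1f} days")  # ~5.8 days, i.e. about a week
```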
People also ask

Which sorting algorithm has the best average-case time complexity?
Merge Sort, Quick Sort, and Heap Sort all have an average-case time complexity of O(n log n), making them efficient for general use.
wscubetech.com - Time and Space Complexity of All Sorting Algorithms
What is the time complexity of Merge Sort?
Merge Sort has a time complexity of O(n log n) in all cases (best, worst, and average). It divides the input array into smaller subarrays and then merges them in sorted order.
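The divide-and-merge scheme described here can be sketched in a few lines; a minimal top-down merge sort (illustrative, not tuned for production use):

```python
def merge_sort(a):
    """Sort a list in O(n log n) time by splitting it and merging sorted halves."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves into one sorted list.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```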
What is the time complexity of Radix Sort?
Radix Sort has a time complexity of O(nk), where n is the number of elements and k is the number of digits or characters in the largest element.
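As an illustration of the O(nk) bound: one stable O(n) distribution pass per digit, k passes in total. A minimal LSD radix sort sketch, assuming non-negative integers and base 10 for readability:

```python
def radix_sort(nums, base=10):
    """LSD radix sort for non-negative integers: k digit passes, each O(n)."""
    if not nums:
        return nums
    exp = 1
    while max(nums) // exp > 0:            # one pass per digit of the largest value
        buckets = [[] for _ in range(base)]
        for x in nums:                     # stable distribution by the current digit
            buckets[(x // exp) % base].append(x)
        nums = [x for b in buckets for x in b]
        exp *= base
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))  # [2, 24, 45, 66, 75, 90, 170, 802]
```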
WsCube Tech
Time and Space Complexity of All Sorting Algorithms
November 26, 2025 - Learn the time and space complexity of all sorting algorithms, including quicksort, mergesort, heapsort, and more, in this step-by-step tutorial.
Medium
A journey to master Sorting Algorithms | by Sylvain Tiset | Medium
October 31, 2024 - Time Complexity: a measure of how long an algorithm takes to run, used to categorize sorting algorithms. It can differ between the best-case, average-case, and worst-case scenarios. Space Complexity: the amount of memory required to execute the algorithm.
LeetCode
Understanding Time and Space Complexity of Sorting Algorithms - Discuss - LeetCode
Quick sort has a space complexity of O(log n) because it uses recursion and requires additional memory for the call stack. Understanding the time and space complexity of sorting algorithms can help you choose the right algorithm for your specific ...
Interview Kickstart
Time and Space Complexities of Sorting Algorithms Explained
December 22, 2024 - Thus the total number of comparisons sums to n * (n - 1) / 2. The number of swaps performed is at most n - 1. So the overall time complexity is quadratic. Since we are not using any extra data structure apart from the input array, the ...
Wikipedia
Sorting algorithm - Wikipedia
Practical general sorting algorithms are almost always based on an algorithm with average time complexity (and generally worst-case complexity) O(n log n), of which the most common are heapsort, merge sort, and quicksort.
Big-O Cheat Sheet (bigocheatsheet.com)
Big-O Algorithm Complexity Cheat Sheet (Know Thy Complexities!) @ericdrowell
This webpage covers the space and time Big-O complexities of common algorithms used in Computer Science. When preparing for technical interviews in the past, I found myself spending hours crawling the internet putting together the best, average, and worst case complexities for search and sorting ...
Hero Vired
Time and Space Complexity of Sorting Algorithms : Hero Vired
Algorithms like merge sort and heap sort are effective for larger data sets due to their O(n log n) time complexity, while simpler options like Bubble Sort are better for smaller sets.
Cornell
7. Sorting Algorithms | CS 2110
It is unstable and not adaptive, and it has an \(O(N^2)\) worst-case time complexity and an \(O(N)\) space complexity. Quicksort tends to perform well in practice and has an \(O(N \log N)\) expected runtime complexity. Using a good pivot-selection strategy improves its performance. ... One property that one might like in a sorting algorithm is this: Let \( v1 \) and \( v2 \) be two values that are equal.
Built In
Sorting Algorithms: Slowest to Fastest | Built In
Hence the time complexity will be O(n). Since we are not making use of any extra data structure apart from the input array, the space complexity will be O(1). The Quicksort algorithm is faster than the previous algorithms because this algorithm ...
FasterCapital
Explanation Of Time And Space Complexity In Sorting Algorithms - FasterCapital
Time complexity refers to the amount of time it takes for an algorithm to complete, while space complexity refers to the amount of memory an algorithm uses. ... The time complexity of a sorting algorithm is usually measured in terms of the number ...
IJERT
Analysis of Sorting Algorithms Using Time Complexity โ€“ IJERT
April 24, 2018 - The efficiency or performance of an algorithm depends on the time and space complexity of the algorithm. The space complexity of an algorithm is the amount of memory it needs to run to completion.
Medium
Time & Space Complexity in Sorting Algorithms | Software Interview | by ibrahimcanerdogan | Medium
February 26, 2024 - It has been shown that quicksort is more complex to implement but generally produces results faster. Selection sort is simpler and less code-heavy and requires less space, but will not generate results as efficiently.
HackerEarth
Sorting And Searching Algorithms - Time Complexities Cheat Sheet - Vipin Khushu
Time complexity cheat sheet and Big-O graph (images). Correction: the best time complexity for Tim sort is O(n log n).
Top answer (1 of 4)

If we are sorting only integers, we can use the in-situ variant of counting sort, which has O(k) space complexity, independent of the variable n. In other words, when we treat k as constant, the space complexity is O(1).

Alternatively, we can use in-place radix sort with lg k phases of binary partition, with O(lg k) space complexity (due to recursion). Or even fewer phases, using counting sort to determine the bucket boundaries for the n-way partition. These solutions sport a time complexity of O(lg k * n), which when expressed only in terms of the variable n is O(n) (when k is considered constant).

Another possible approach to obtain O(n) step complexity and O(1) space complexity, when k is considered constant, is to use something which can be called subtraction sort, as described by the OP in their own answer, or elsewhere. It has step complexity O(sum(input)), which is better than O(kn) (and for certain specific inputs it is even better than binary radix sort's O(lg k * n), e.g. for all inputs of the form [k, 0, 0, ... 0]), and space complexity O(1).

Yet another solution is to use bingo sort which has step complexity O(vn) where v <= k is the number of unique values in the input, and space complexity O(1).

Note that none of these sorting solutions is stable, which matters if we sort something more than just integers (some arbitrary objects with integer keys).

There is also a cutting-edge stable partition algorithm described in this paper with O(1) space complexity. Combining it with radix sort, one may construct a stable linear sorting algorithm with constant space: O(lg k * n) step complexity and O(1) space complexity.


EDIT:

As per the request from the comment, I've tried to find a source for the "in-situ" variant of counting sort, but haven't found anything of good quality I could link to (it's really strange that there is no easily available description for such a basic algorithm). Therefore, I'm posting the algorithm here:

The regular counting sort (from Wikipedia)

count = array of k+1 zeros
for x in input do
    count[key(x)] += 1

total = 0
for i in 0, 1, ... k do
    count[i], total = total, count[i] + total

output = array of the same length as input
for x in input do
    output[count[key(x)]] = x
    count[key(x)] += 1 

return output

It assumes that the input consists of some objects which can be identified by an integer key in the range 0 to k - 1. It uses O(n + k) extra space.
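The pseudocode above translates almost line for line into Python; a runnable rendering (the `key` default and the example values are illustrative; the count array has k + 1 slots, so keys from 0 up to k are accommodated):

```python
def counting_sort(items, k, key=lambda x: x):
    """Stable counting sort for objects with integer keys in 0..k.
    Uses O(n + k) extra space, matching the pseudocode above."""
    count = [0] * (k + 1)
    for x in items:
        count[key(x)] += 1
    total = 0
    for i in range(k + 1):     # prefix sums: count[i] = first output index for key i
        count[i], total = total, count[i] + total
    output = [None] * len(items)
    for x in items:            # place each item, advancing its key's slot
        output[count[key(x)]] = x
        count[key(x)] += 1
    return output

print(counting_sort([3, 1, 4, 1, 5], k=5))  # [1, 1, 3, 4, 5]
```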

The trivial in-situ variant for integers

This variant requires the input to be pure integers, not arbitrary objects with integer keys. It simply reconstructs the input array from the count array.

count = array of k zeros
for x in input do
    count[x] += 1

i = 0
for x in 0, 1, ... k - 1 do
    for j in 1, 2, ... count[x] do
        input[i], i = x, i + 1

return input

It uses O(k) extra space.
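A runnable Python rendering of this variant (assuming plain integers in the range 0 to k - 1; the example values are illustrative):

```python
def counting_sort_in_situ(arr, k):
    """In-situ counting sort for plain integers in 0..k-1.
    O(n + k) time, O(k) extra space; rewrites the input array in place."""
    count = [0] * k
    for x in arr:
        count[x] += 1
    i = 0
    for value in range(k):           # rewrite the array from the counts
        for _ in range(count[value]):
            arr[i] = value
            i += 1
    return arr

print(counting_sort_in_situ([2, 0, 3, 1, 1], k=4))  # [0, 1, 1, 2, 3]
```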

The complete in-situ variant for arbitrary objects with integer keys

This variant accepts arbitrary objects similarly to the regular variant. It uses swaps to place objects in appropriate places. After computing the count array in the two first loops it leaves it immutable, and uses another array called done to keep track of how many objects with a given key have been already placed in the right position.

count = array of k+1 zeros
for x in input do
    count[key(x)] += 1

total = 0
for i in 0, 1, ... k do
    count[i], total = total, count[i] + total

done = array of k zeros
for i in 0, 1, ... k - 1 do
    current = count[i] + done[i]
    while done[i] < count[i + 1] - count[i] do
        x = input[current]
        destination = count[key(x)] + done[key(x)]
        if destination = current then
            current += 1
        else
            swap(input[current], input[destination])
        done[key(x)] += 1 

return input

This variant is not stable, so it cannot be used as a subroutine in radix sort. It uses O(2k) = O(k) extra space.
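A direct Python translation of this swap-based variant (the tuple-based usage in the test values is illustrative; keys are assumed to lie in 0 to k - 1, as in the pseudocode):

```python
def counting_sort_in_situ_keys(arr, k, key=lambda x: x):
    """In-situ (unstable) counting sort for objects with integer keys in 0..k-1,
    following the swap-based pseudocode above. O(k) extra space."""
    count = [0] * (k + 1)
    for x in arr:
        count[key(x)] += 1
    total = 0
    for i in range(k + 1):           # prefix sums: count[i] = start index of key i
        count[i], total = total, count[i] + total
    done = [0] * k                   # how many items with each key are placed
    for i in range(k):
        current = count[i] + done[i]
        while done[i] < count[i + 1] - count[i]:
            x = arr[current]
            destination = count[key(x)] + done[key(x)]
            if destination == current:
                current += 1         # already in place; move on
            else:
                arr[current], arr[destination] = arr[destination], arr[current]
            done[key(x)] += 1
    return arr

print(counting_sort_in_situ_keys([2, 0, 3, 1, 1], k=4))  # [0, 1, 1, 2, 3]
```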

Answer 2 of 4

I wanted to include an algorithm here which is an improvement of Mathphile's first answer. In that case the idea was to shave 1 off each number in the unsorted suffix of the input (while swapping sorted numbers into the prefix). Whenever a number in the unsorted suffix hits 0, it means it is smaller than any other number in the unsorted suffix (because all numbers are being reduced at the same rate).

There is a major improvement possible: with no change to time complexity, we can subtract numbers much bigger than 1 - in fact, we can subtract a number equal to the smallest remaining unsorted item. This allows the sort to function well regardless of the numeric sizes of the array items, and even on floating-point values. A JavaScript implementation:

let subtractSort = arr => {
  
  let sortedLen = 0; // prefix arr[0..sortedLen) holds finalized sorted values
  let lastMin = 0;   // amount subtracted from every unsorted item this pass; could also start as `Math.min(...arr)`
  let total = 0;     // sum of all subtractions so far = original value of any item that reaches 0
  while (sortedLen < arr.length) {
    
    let min = arr[sortedLen];
    for (let i = sortedLen; i < arr.length; i++) {
      
      if (arr[i]) {
        
        // Still unsorted: reduce it and track the smallest nonzero remainder
        arr[i] -= lastMin;
        if (arr[i]) min = Math.min(min, arr[i]);
        
      } else {
        
        // Reached 0: this was the smallest remaining item, and its original
        // value is `total`. Move it into the sorted prefix.
        arr[i] = arr[sortedLen];
        arr[sortedLen] = total;
        sortedLen++;
        
      }
      
    }
    
    total += lastMin;
    lastMin = min;
    
  }
  return arr;
  
};

let examples = [
  [ 3, 2, 5, 4, 8, 5, 7, 1 ],
  [ 3000, 2000, 5000, 4000, 8000, 5000, 7000, 1000 ],
  [ 0.3, 0.2, 0.5, 0.4, 0.8, 0.5, 0.7, 0.1 ],
  [ 26573726573, 678687, 3490, 465684586 ]
];
for (let example of examples) {
  console.log(`Unsorted: ${example.join(', ')}`);
  console.log(`Sorted:   ${subtractSort(example).join(', ')}`);
  console.log('');
}

Note that this sort only works with positive numbers. To work with negative numbers, we would need to find the most negative item, subtract this negative value from every item in the array, sort the array, and finally add the most negative value back to every item; overall this doesn't increase the time complexity.
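The shift trick described above is generic: it works with any sort that handles only non-negative numbers, and adds just O(n) extra work. A Python sketch, with the built-in `sorted` standing in for the non-negative-only sort purely for illustration:

```python
def sort_with_negatives(arr, sort_nonneg):
    """Extend a non-negative-only sort to negatives: shift up, sort, shift back.
    The two O(n) shift passes leave the overall complexity unchanged."""
    if not arr:
        return arr
    lo = min(arr)
    if lo >= 0:
        return sort_nonneg(arr)
    shifted = [x - lo for x in arr]        # now every element is >= 0
    return [x + lo for x in sort_nonneg(shifted)]

# Illustration, with sorted() as the inner routine:
print(sort_with_negatives([3, -2, 5, -7, 0], sorted))  # [-7, -2, 0, 3, 5]
```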

Board Infinity
Time Complexity of Sorting Algorithms | Board Infinity
January 3, 2025 - ... Time Complexity Analysis: the worst-case, average-case, and best-case time complexity of Selection Sort is O(n^2), because it always scans the unsorted section of the array even if the input is already sorted.
Programiz
Sorting Algorithm
1. Time Complexity: Time complexity refers to the time taken by an algorithm to complete its execution with respect to the size of the input. It can be represented in different forms: ... 2. Space Complexity: Space complexity refers to the total amount of memory used by the algorithm for a ...
Uncodemy
Time Complexity of Sorting Algorithms: A Comprehensive Guide
Is there any sorting algorithm with linear time complexity? Yes - Counting Sort, Bucket Sort, and Radix Sort can achieve linear time under certain conditions.
What does it mean for a sorting algorithm to be stable? A stable sort maintains the relative order of equal elements.
Which sorting algorithm uses the least memory? Heap Sort uses only O(1) auxiliary space.
W3Schools
DSA Time Complexity
When talking about "operations" here, "one operation" might take one or several CPU cycles, and it really is just a word helping us to abstract, so that we can understand what time complexity is, and so that we can find the time complexity for different algorithms. One operation in an algorithm can be understood as something we do in each iteration of the algorithm, or for each piece of data, that takes constant time. For example: Comparing two array elements, and swapping them if one is bigger than the other, like the Bubble sort algorithm does, can be understood as one operation.