Here is quicksort in a nutshell:

  • Choose a pivot somehow.
  • Partition the array into two parts (smaller than the pivot, larger than the pivot).
  • Recursively sort the first part, then recursively sort the second part.
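The three steps above can be sketched in a few lines of Python (a minimal illustration, not a tuned implementation; the last-element pivot and the Lomuto-style partition are choices made here, not part of the outline):

```python
def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    # 1. Choose a pivot somehow (here: the last element).
    pivot = a[hi]
    # 2. Partition: everything smaller than the pivot ends up left of index i.
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    # 3. Recursively sort the first part, then the second part.
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)
```

Calling `quicksort(data)` sorts `data` in place; the only extra space used is the recursion stack discussed below.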

Each recursive call uses O(1) words in local variables, hence the total space complexity is proportional to the height of the recursion tree.

The height of the recursion tree is always at least log n, hence this is a lower bound on the space complexity. If you choose the pivot at random or using a good heuristic, then the recursion tree will have height O(log n), and so the space complexity is O(log n). If the pivot can be chosen adversarially, you can cause the recursion tree to have height n, causing the worst-case space complexity to be Θ(n).
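Both regimes can be observed directly by instrumenting a quicksort to record its deepest recursion level (an illustrative sketch; the `stats` bookkeeping and the fixed last-element pivot, which is deliberately bad for already-sorted input, are choices made here):

```python
import random

def quicksort_depth(a, lo=0, hi=None, depth=1, stats=None):
    """Lomuto quicksort that records the deepest recursion level reached."""
    if hi is None:
        hi = len(a) - 1
    if stats is None:
        stats = {"max_depth": 0}
    stats["max_depth"] = max(stats["max_depth"], depth)
    if lo < hi:
        pivot = a[hi]  # fixed pivot rule: worst case on sorted input
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        quicksort_depth(a, lo, i - 1, depth + 1, stats)
        quicksort_depth(a, i + 1, hi, depth + 1, stats)
    return stats["max_depth"]

# Shuffled input: depth stays proportional to log2(n).
print(quicksort_depth(random.sample(range(256), 256)))  # typically well under 50
# Already-sorted input with this pivot rule: depth grows linearly.
print(quicksort_depth(list(range(256))))  # 256
```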

Answer from Yuval Filmus on Stack Exchange
GeeksforGeeks
Time and Space Complexity Analysis of Quick Sort - GeeksforGeeks
July 23, 2025 - The space complexity of Quick Sort in the best case is O(log n), while in the worst-case scenario, it becomes O(n) due to unbalanced partitioning causing a skewed recursion tree that requires a call stack of size O(n).
Programiz PRO
Exploring the Time and Space Complexities of Quick Sort
In the average case, quick sort performs well with O(n log n) time complexity. Assuming that the pivot divides the array into roughly equal parts, each partitioning step takes O(n) time, and the recursion depth is O(log n). Comparing the Number of Operations Required for Different Input Sizes ...
Discussions

algorithm - Is it possible to implement quicksort with O(1) space complexity? - Stack Overflow
I'm curious as to whether it's possible to implement quicksort non-recursively and, in doing so, implement it with constant space complexity. ... yes, you just have to pick the pivot elements in such a way that you are able to predict the partition sizes. ... @Daniel: It's been proven that you can't do a sort ... More on stackoverflow.com
Why is the worst case space complexity of Quicksort O(n)?
Quicksort is a recursive algorithm in its nature. The space complexity is not coming from needing an additional array (the array can get partitioned in-place), but from the call stack of each recursive function call. In the worst case, that is, the array is already sorted in reverse order, the intuitive implementation of quicksort will call itself n times, thereby needing an O(n) large call stack. Note that there is a method that can achieve O(log n) space complexity even in the worst case, but the reason for the space complexity is the same -- it comes from the call stack, and that method uses a clever trick to limit the number of recursion calls. More on reddit.com
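The "clever trick" mentioned at the end of that answer is usually this: after partitioning, recurse only into the smaller side and loop on the larger side. Each recursive call then covers at most half of its parent's range, so the stack depth is bounded by about log2(n) even with adversarial pivots. The thread does not show code; this sketch (Lomuto partition, illustrative names) is one way to realize it:

```python
def quicksort_logspace(a, lo=0, hi=None):
    """In-place quicksort whose recursion depth is O(log n) even in the worst case."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        # Partition around the last element (Lomuto scheme).
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        # Recurse into the smaller side only; iterate on the larger side.
        if i - lo < hi - i:
            quicksort_logspace(a, lo, i - 1)
            lo = i + 1
        else:
            quicksort_logspace(a, i + 1, hi)
            hi = i - 1
```

On an already-sorted array (the worst case for this pivot rule) the function never recurses more than a constant depth: the degenerate side is empty, so all the work happens in the while loop.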
r/ECE · January 27, 2024
Time Complexities of all Sorting Algorithms
geeksforgeeks.org · September 23, 2016
When would you ever want bubblesort?
When you're taking an exam and the question says "implement bubble sort." More on reddit.com
r/programming · December 4, 2023
People also ask

What is the time complexity of Radix Sort?
Radix Sort has a time complexity of O(nk), where n is the number of elements and k is the number of digits or characters in the largest element.
wscubetech.com
Time and Space Complexity of All Sorting Algorithms
What is the time complexity of Merge Sort?
Merge Sort has a time complexity of O(n log n) in all cases (best, worst, and average). It divides the input array into smaller subarrays and then merges them in sorted order.
wscubetech.com
Time and Space Complexity of All Sorting Algorithms
What is the time complexity of Bucket Sort?
The time complexity of Bucket Sort is O(n + k) in the best and average cases, where k is the number of buckets and n is the number of elements. However, in the worst case (when all elements fall into the same bucket), the time complexity is O(n²).
wscubetech.com
Time and Space Complexity of All Sorting Algorithms
Top answer (1 of 3)

Wikipedia is not always wrong. And, as the section suggests, there is a way to do the quicksort, or something similar, using constant space. One important point. Quicksort itself could be defined as a recursive partitioning algorithm. If so, then by definition it will require O(n) stack space. However, I'm assuming that you are not using such a pedantic definition.

Just a quick review of how the partitioning works. Given an array, a starting point and an ending point, a partition value is chosen. The data elements in the array are then split so everything less than the partition value is on the left and everything greater is on the right. A good way of doing this is by starting at each end, finding the first value that doesn't belong, and swapping them. This, by the way, uses constant space.
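The two-ended scan described here is essentially Hoare's original partition scheme. A sketch in Python (pivoting on the first element is a choice made here, and the helper name is illustrative):

```python
def hoare_partition(a, lo, hi):
    """Partition a[lo..hi] around pivot a[lo] using O(1) extra space.
    Returns j such that every element of a[lo..j] is <= pivot and
    every element of a[j+1..hi] is >= pivot."""
    pivot = a[lo]
    i, j = lo - 1, hi + 1
    while True:
        # Scan from the left for a value that doesn't belong there...
        i += 1
        while a[i] < pivot:
            i += 1
        # ...and from the right for a value that doesn't belong there.
        j -= 1
        while a[j] > pivot:
            j -= 1
        if i >= j:
            return j  # the scans have crossed: both halves are settled
        a[i], a[j] = a[j], a[i]
```

Only the two indices and the pivot value are stored, which is the constant-space property the answer relies on.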

So, each step of the algorithm is going through the array. Let's remember this fact.

Now, we can make an interesting observation. If we do the recursive partitioning in a depth-first fashion, then we only have to store the end points of each range. On the way down, the left edge of the array is always the beginning. The end point gets successively closer to the beginning, until there are just two elements that can be swapped, or not. At this point, the beginning moves over two slots, but we don't know the end. So, look up the end and continue the process. Then at the next step "up", we need the next end point, and so on.

The question is: Can we find the end by some means other than storing the actual value in a stack?

Well, the answer is "yes".

Each step in the recursive partitioning algorithm reads through all the data. We can do some additional calculations on the data. In particular, we can calculate the largest value and the second largest value. (I would also calculate the smallest value, but that is an optimization.)

What we do with the values is mark the ranges. On the first split, this means putting the second largest value at the split point and the largest value at the end of the range. On the way back up the tree, you know where the range starts. The end of the range is the first value larger than that value.

Voila! You can move up the "recursion" tree without storing any data. You are just using the data as presented.

Once you have accomplished this, you simply need to change the algorithm from a recursive algorithm to a while loop. The while loop rearranges the data, setting a starting point and stopping point at each step. It chooses a splitter, splits the data, marks the starting and ending point, and then repeats on the left side of the data.

When it has gotten down to the smallest unit, it then checks whether it is done (has it reached the end of the data). If not, it looks at the data point one unit over to find the first marker. It then goes through the data to look for the end point. This search, by the way, is equivalent in complexity to the partitioning of the data, so it does not add to the order of complexity. It then iterates through this array, continuing the process until it is done.

If you have duplicates in the data, the process is slightly more complex. However, if there are log(N) duplicates, I would almost argue for removing the duplicates, sorting the data using the remaining slots as a stack, and then incorporating them back in.

Why this is quicksort. The quicksort algorithm is a partition exchange algorithm. The algorithm proceeds by choosing a splitter value, partitioning the data on the two sides, and repeating this process. Recursion is not necessary, as Jeffrey points out in his answer. It is a great convenience.

This algorithm proceeds in exactly the same way. The partitioning follows the same underlying rule, with smaller records on the left and larger records on the right. The only difference is that within each partition, particular values are chosen to be on the edges of the partition. By careful placement of these values, no additional "per-step" storage is needed. Since these values belong in the partition, this is a valid partition according to the quicksort principle of partition-and-repeat.

If one argues that a quicksort must use recursion, then this would fail that strict test (and the answer to the original question is trivial).

2 of 3

It's entirely possible to implement it non-recursively, but you do that by implementing a stack separate from the normal function call/return stack. It may save some space by only storing the essential information instead of a lot of (mostly identical) function return addresses, but its size is still going to be logarithmic, not constant.
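A sketch of that separate-stack approach: store (lo, hi) index pairs instead of return addresses, and always postpone the larger side, so the stack never holds more than about log2(n) + 1 entries. The depth bookkeeping below is illustrative, added only to make the logarithmic bound visible:

```python
def quicksort_iterative(a):
    """Non-recursive quicksort with an explicit stack of index pairs.
    Returns the deepest the stack ever got (at most ~log2(n) + 1 entries,
    because the larger partition is always the one postponed)."""
    stack = [(0, len(a) - 1)]
    max_stack = 1
    while stack:
        lo, hi = stack.pop()
        while lo < hi:
            # Lomuto partition around the last element.
            pivot = a[hi]
            i = lo
            for j in range(lo, hi):
                if a[j] < pivot:
                    a[i], a[j] = a[j], a[i]
                    i += 1
            a[i], a[hi] = a[hi], a[i]
            # Push (postpone) the larger side, keep working on the smaller.
            if i - lo < hi - i:
                stack.append((i + 1, hi))
                hi = i - 1
            else:
                stack.append((lo, i - 1))
                lo = i + 1
            max_stack = max(max_stack, len(stack))
    return max_stack
```

Each pushed entry is at most half the size of the entry one slot below it, which is exactly why the stack stays logarithmic rather than constant.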

Quicksort Definition

Since there's been discussion about whether (for example) the algorithm cited by @Gordon Linoff in his answer is really a Quicksort, I'll refer to C.A.R. Hoare's paper describing Quicksort, which seems to me the most authoritative source available about what does or does not constitute a Quicksort. According to his paper:

Meanwhile, the addresses of the first and last items of the postponed segment must be stored.

While it's not too much of a stretch to store something that's (more or less) equivalent to an address (e.g., an index) rather than an actual address, it seems to me that when the description of the algorithm directly states that you must store an address, an algorithm that does not store an address or anything even roughly equivalent to it, is no longer an implementation of that same algorithm.

Reference

https://academic.oup.com/comjnl/article/5/1/10/395338

Wikipedia
Quicksort - Wikipedia
1 day ago - However, without Sedgewick's trick ... complexity viewpoint, variables such as lo and hi do not use constant space; it takes O(log n) bits to index into a list of n items....
Find elsewhere
Medium
Time & Space Complexity in Sorting Algorithms | Software Interview | by ibrahimcanerdogan | Medium
February 26, 2024 - Therefore, space complexity = O(1). When evaluating time complexity, a good rule of thumb is to consider what will happen if the list is doubled. Naturally, the inner and outer loops will have to increase by no iterations to match the additional ...
WsCube Tech
Time and Space Complexity of All Sorting Algorithms
November 26, 2025 - Learn the time and space complexity of all sorting algorithms, including quicksort, mergesort, heapsort, and more, in this step-by-step tutorial.
RPTU
Algorithms and Data Structures Marius Kloft Sorting: Merge Sort and Quick Sort
Quick Sort: Algorithm · Average Case Analysis · Improving Space Complexity. Marius Kloft: Alg&DS, Summer Semester 2016. Comparison of Merge Sort and Quick Sort: What can we do better than Merge Sort? The O(n) additional space is a problem.
Rose-Hulman Institute of Technology
Quicksort algorithm Average case analysis After today, you should be able to…
snippets until the list is sorted. For real: https://gkoberger.github.io/stacksort/ · Invented by C.A.R. “Tony” Hoare in 1961 · Very widely used · Somewhat complex, but fairly easy to understand · Like in basketball, it’s all about planting a good pivot.
Reddit
r/ECE on Reddit: Why is the worst case space complexity of Quicksort O(n)?
January 27, 2024

Hi,

I have read that the worst case space complexity for Quicksort is O(n), and for the average and best cases it is O(log(n)).

I was watching this video on Quicksort: https://www.youtube.com/watch?v=WprjBK0p6rw

I think the worst case occurs for input [9, 8, 7, 6, 5, 4, 3, 2, 1] with the initial pivot "1".

I don't see how the space complexity translates into O(n).

In the first iteration you load "1" in one CPU register. Then, start the comparison by loading "9" in another register. Since "9" is greater than "1", the algorithm searches for an element which is smaller than "1" in order to swap "9" with that smaller element. As the algorithm cannot find any smaller element, the "1" is swapped with "9". This would be the end result: [1, 8, 7, 6, 5, 4, 3, 2, 9]. Only two registers are needed for all the comparisons.

For the next iteration "2" is selected as the pivot. Since "1" is already smaller than "2", it is left as it is. The end result would be [1, 2, 7, 6, 5, 4, 3, 8, 9]. Again only two registers are used for all the comparisons. So, where is this O(n) space complexity coming from?

If the input size is made bigger, such as [20, 19, 18, 17, 16, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1], even then the same number of registers will be used. In other words the number of registers required for comparisons is the same.

Where am I going wrong with it? Could you please guide me with it?

GATE Overflow
Programming in Python: GO Classes DPP | PYTHON & DSA |MERGESORT
February 21, 2026 - Consider the Merge Sort algorithm being applied to an array of $n$ elements. Which of the following statements ... and iv only i, iii, and iv only
GeeksforGeeks
Time Complexities of all Sorting Algorithms - GeeksforGeeks
September 23, 2016 - Worst Time Complexity: Define the input for which the algorithm takes maximum time. In the worst case, calculate the upper bound of the algorithm. Example: In linear search, the worst case occurs when the search data is present at the last location of a large dataset. Following is a quick revision sheet that you may refer to at the last minute: Searching and Sorting articles · Previous year GATE Questions on Sorting · Time and Space Complexity of Insertion Sort ·
Quora
What is the time complexity of quicksort if there is no good partitioning point? - Quora
Answer (1 of 2): For an arbitrary pivoting strategy, Quick Sort takes Θ(n²) time in the worst case. This is because partition takes Θ(n) time in the worst case, and a (very) poorly selected pivot (each time) can require Θ(n) ...
W3Schools
DSA Introduction
For optimizing processes, such as arranging tasks so they can be completed as quickly as possible. For solving complex problems: From finding the best way to pack a truck to making a computer 'learn' from data.
Medium
A Quick Explanation of Quick Sort | by Karuna Sehgal | Karuna Sehgal | Medium
February 5, 2018 - Overall time complexity of Quick Sort is O(n log n). In the worst case, it makes O(n²) comparisons, though this behavior is rare. The space complexity of Quick Sort is O(n log n). It is an in-place sort (i.e.
Medium
Quick Sort Algorithm: A Quick Overview | Medium
June 29, 2023 - As a result, the maximum depth of the call stack is proportional to log(n), so the space complexity in the average and best cases of QuickSort is O(log n). ... In the worst-case scenario, the partitioning is highly imbalanced, resulting in n ...
Scribd
Quicksort Algorithm Explained | PDF | Computing | Computer Science
• An efficient, in-place sorting algorithm • Developed by Tony Hoare in 1959 • Uses divide-and-conquer strategy • Average time complexity: O(n log n) Key Characteristics • Recursive algorithm • Partitioning-based sorting • Not ...
VisuAlgo
Sorting (Bubble, Selection, Insertion, Merge, Quick, Counting, Radix) - VisuAlgo
The best case scenario of Quick ... is only O(log N). As each level takes O(N) comparisons, the time complexity is O(N log N)....
Oxford College of Emory University
Quick Sort Analysis
Quick sort has a high recursive overhead when the arrays being considered are tiny. In the process of recursively calling quick sort on smaller and smaller arrays -- when the arrays are around 10 elements long, we could switch to an insertion sort to improve the overall efficiency.