Interview Kickstart has enabled over 3500 engineers to uplevel.

Are you preparing for a coding interview?

Brushing up on data structures and sorting algorithms is a big part of interview prep for software engineers. Among sorting algorithms, quicksort is a must-learn technique. This article will cover all the basics to help you review:

- What Is Quicksort?
- How Quicksort Works (With Pseudocode)
- Quicksort Algorithm
- Quicksort Code
- Quicksort Complexity
- Advantages of Quicksort
- Disadvantages of Quicksort
- Quicksort FAQs

Quicksort is one of the most efficient and commonly used sorting algorithms. It is a divide-and-conquer algorithm. It’s also an in-place algorithm, as it does not use an auxiliary array for sorting the input data, unlike the merge sort algorithm.

And don’t worry when you hear someone say “**Partition Exchange Sort**” — it’s just another name for quicksort.

Quicksort is used in:

- **Commercial computing:** Various government and private organizations use quicksort, for example, to sort student profiles by name or enrolment number, employee profiles by employee ID, or transactions by date or time.
- **Information searching:** It is widely used to make searching for information faster and better.
- **Numerical computations and scientific research:** Many efficiently designed algorithms use quicksort to sort their data.

Quicksort works by dividing a large array into smaller arrays and then sorting those smaller arrays. A pivot element is used to partition a large array into smaller arrays; smaller arrays are divided recursively until an array with only one or no elements is created.

**Quicksort implementation requires two main methods:**

- **quickSort():** Selects a pivot element, partitions the array around the pivot, and recursively sorts the partitioned subarrays.
- **partition():** Places the pivot element at its correct position in the sorted array, moving all smaller elements to the pivot’s left and all larger-or-equal elements to its right. It returns the pivot’s final index.

There are several different ways of array partitioning and pivot selection. We’ll cover these in the following sections.

Lomuto partition scheme selects a pivot element, which is typically the last element of the array. While partitioning a range [start, end], the algorithm maintains index i as it scans the array.

Necessary swaps are made to ensure that after every iteration:

- The elements from index “start” to i-1 (inclusive) contain the numbers smaller than the pivot
- The elements from index i through the last scanned number (inclusive) are equal to or larger than the pivot
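Under these invariants, a Lomuto partition can be sketched in Java as follows (the class and method names are ours, for illustration):

```java
public class LomutoPartitionDemo {
    // Lomuto partition of arr[start..end] with pivot = arr[end].
    // Returns the pivot's final index p. Afterwards, arr[start..p-1] < pivot
    // and arr[p+1..end] >= pivot.
    static int lomutoPartition(int[] arr, int start, int end) {
        int pivot = arr[end];
        int i = start - 1; // last index of the "< pivot" region
        for (int j = start; j < end; j++) {
            if (arr[j] < pivot) {
                i++;
                int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
            }
        }
        // Place the pivot between the two regions.
        int tmp = arr[i + 1]; arr[i + 1] = arr[end]; arr[end] = tmp;
        return i + 1;
    }
}
```

For example, partitioning {2, 11, 9, 4, 13, 5} over the whole range returns index 2 and leaves the array as {2, 4, 5, 11, 13, 9}.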

Hoare partition scheme uses two pointers — start and end. “start” is assigned the leftmost index, and “end” is assigned the rightmost index of the current partition. Both pointers move towards each other until a wrong pair is encountered. By a wrong pair, we mean:

- A pair consisting of an element greater than or equal to pivot at the first pointer ...
- … and an element smaller than or equal to pivot at the second pointer

The elements of the wrong pair are then swapped. This continues until “start” and “end” meet each other.
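A minimal Hoare partition sketch in Java, assuming the first element is chosen as the pivot (the class and method names are ours):

```java
public class HoarePartitionDemo {
    // Hoare partition of arr[start..end] with pivot = arr[start].
    // Returns an index p such that every element of arr[start..p] is at most
    // every element of arr[p+1..end]; the pivot itself may end up anywhere.
    static int hoarePartition(int[] arr, int start, int end) {
        int pivot = arr[start];
        int i = start - 1;
        int j = end + 1;
        while (true) {
            do { i++; } while (arr[i] < pivot); // first "wrong" element from the left
            do { j--; } while (arr[j] > pivot); // first "wrong" element from the right
            if (i >= j) {
                return j; // pointers met or crossed: partitioning is done
            }
            int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp; // swap the wrong pair
        }
    }
}
```

For instance, partitioning {7, 3, 9, 1, 5} returns index 2 and leaves {5, 3, 1, 9, 7}: everything up to index 2 is at most everything after it.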

In the Hoare partition scheme, you can choose any element of the array as the pivot. However, selecting the last element as the pivot is usually avoided because it can lead to infinite recursion. For example, if the given array is {2, 9, 6, 11} and the pivot is arr[end], the returned index will always equal end, so quickSort will keep calling itself on the same range forever.

Note: In this scheme, the pivot’s final location is not necessarily the returned index Pi. The two partitions that quickSort then recurses on are (start to Pi) and (Pi+1 to end), as opposed to (start to Pi-1) and (Pi+1 to end) in Lomuto’s scheme.

There are several common ways to select the pivot element:

- Select the leftmost element
- Select the rightmost element
- Select a random element
- Select the middle index
- Select the median of the first, middle and last element
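As a sketch of the last strategy, a median-of-three helper might look like this in Java (the class and method names are ours):

```java
public class PivotSelectionDemo {
    // Returns the index of the median of arr[start], arr[mid], arr[end].
    // Using this index as the pivot avoids the worst case on already-sorted input.
    static int medianOfThreeIndex(int[] arr, int start, int end) {
        int mid = start + (end - start) / 2;
        int a = arr[start], b = arr[mid], c = arr[end];
        if ((a <= b && b <= c) || (c <= b && b <= a)) return mid;
        if ((b <= a && a <= c) || (c <= a && a <= b)) return start;
        return end;
    }
}
```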

In this article, we will be using the Lomuto partition scheme, as it is more compact and easier to understand. You can use the Hoare partition scheme if you want a more efficient approach; on average, it performs fewer swaps.

We will be selecting the rightmost element as the pivot.

**Steps:**

**Step 1:** Select a pivot element.

**Step 2:** Partition the array around the pivot element: move all elements < pivot to the pivot’s left and all elements >= pivot to its right.

**Step 3:** After Step 2, the pivot element is in its correct sorted position.

**Step 4:** Apply quicksort recursively to the left partition.

**Step 5:** Apply quicksort recursively to the right partition.

**Step 6:** Stop the recursion when the base case is reached: an array of size zero or one.

**Example:**

Given array = {40, 21, 8, 17, 51, 34}

Sorted array = {8, 17, 21, 34, 40, 51}

We’ve used Java to demonstrate the code. Use this as a reference to code in C, C++, Python, or any programming language of your choice.
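A complete Java implementation of this approach, using the Lomuto scheme with the rightmost element as the pivot (class and method names are ours), could look like the following:

```java
public class QuickSortDemo {
    // Places the pivot (arr[end]) at its correct sorted position and
    // returns that position's index.
    static int partition(int[] arr, int start, int end) {
        int pivot = arr[end];
        int i = start - 1; // last index of the "< pivot" region
        for (int j = start; j < end; j++) {
            if (arr[j] < pivot) {
                i++;
                int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
            }
        }
        int tmp = arr[i + 1]; arr[i + 1] = arr[end]; arr[end] = tmp;
        return i + 1;
    }

    // Recursively sorts arr[start..end]; the base case (a subarray of
    // size zero or one) is reached when start >= end.
    static void quickSort(int[] arr, int start, int end) {
        if (start < end) {
            int pi = partition(arr, start, end);
            quickSort(arr, start, pi - 1); // left partition
            quickSort(arr, pi + 1, end);   // right partition
        }
    }

    // Formats the array as space-separated values, matching the output below.
    static String join(int[] arr) {
        StringBuilder sb = new StringBuilder();
        for (int x : arr) sb.append(x).append(' ');
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        int[] arr = {2, 11, 9, 4, 13, 5};
        System.out.println("Original array - " + join(arr));
        quickSort(arr, 0, arr.length - 1);
        System.out.println("Array after sorting - " + join(arr));
    }
}
```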

**Output:**

Original array - 2 11 9 4 13 5

Array after sorting - 2 4 5 9 11 13

arr[] = {2, 11, 9, 4, 13, 5}

start = 0, end = 5, pivot = arr[end] = 5

i = (start - 1) = -1 (index of smaller element), j = 0 (loop variable)

Traverse elements from start to end - 1

**Pass 1:**

j = 0

i = 0

Since arr[j] < pivot, i++ and swap (arr[i], arr[j] )

arr[] = {2, 11, 9, 4, 13, 5} // no change as arr[i] = arr[j] = 2

**Pass 2:**

j = 1

i = 0

Since (arr[j] > pivot) //no change

arr[] = {2, 11, 9, 4, 13, 5}

**Pass 3:**

j = 2

i = 0. Since (arr[j] > pivot) // no change

arr[] = {2, 11, 9, 4, 13, 5}

**Pass 4:**

j = 3

i = 1

Since arr[j] < pivot , i++ and swap (arr[i], arr[j])

arr[] = {2, 4, 9, 11, 13, 5} // swapped arr[1] with arr[3]

**Pass 5:**

j = 4

i = 1

Since arr[j] > pivot //no change

arr[] = {2, 4, 9, 11, 13, 5}

j = 5 > end -1. So swap arr[i+1] and pivot

arr[] = {2, 4, 5, 11, 13, 9} //swapped arr[2] and pivot

**First, let's sort the left partition:**

arr[] = {2, 4, 5, 11, 13, 9}

start = 0, end = (pivot index - 1) = 1, pivot = arr[end] = 4

i = (start - 1) = -1 (index of smaller element), j = 0 (loop variable)

**Pass 1:**

i=-1

j=0

Since arr[j] < pivot , i++ and swap (arr[i], arr[j])

arr[] = {2, 4, 5, 11, 13, 9} // no change as arr[i] = arr[j] = 2

j = 1 > end -1. So swap arr[i+1] and pivot

arr[] = {2, 4, 5, 11, 13, 9} // no change as arr[i+1] = pivot = 4

quickSort makes no further recursive calls here because the left partition contains only one element (2) and the right partition contains no elements; there is no element greater than 4 between indices start = 0 and end = 1.

**Next, let's sort the right partition of the first call:**

arr[] = {2, 4, 5, 11, 13, 9}

start = (pivot index + 1) = 3, end = 5, pivot = arr[end] = 9

i = 2 (index of smaller element), j = 3 (loop variable)

Traverse elements from start to end

**Pass 1:**

j = 3

i = 2. Since (arr[j] > pivot) // no change

arr[] = {2, 4, 5, 11, 13, 9}

**Pass 2:**

j = 4

i = 2. Since (arr[j] > pivot) // no change

arr[] = {2, 4, 5, 11, 13, 9}

j = 5 > end -1. So swap arr[i+1] and pivot

arr[] = {2, 4, 5, 9, 13, 11} //swapped arr[3] and pivot

quickSort will not recursively sort the left partition further because it does not contain any element; there is no element less than 9 between indices start = 3 to end = 5.

**So, let's sort the right partition of this recursive call:**

arr[] = {2, 4, 5, 9, 13, 11}

start = 4, end = 5, pivot = arr[end] = 11

i = 3 (index of smaller element), j = 4 (loop variable)

Traverse elements from start to end

**Pass 1:**

j = 4

i = 3. Since (arr[j] > pivot) // no change

arr[] = {2, 4, 5, 9, 13, 11}

j = 5 > end -1. So swap arr[i+1] and pivot

arr[] = {2, 4, 5, 9, 11, 13} //swapped arr[4] and pivot

The array is now sorted.

Let’s assume that the array to be sorted has n elements.

**Best case: O(n log n)**

This happens if we pick the median of the array as the pivot element every time. Each subarray is then half the size of its parent, so the time taken is O(n log n).

**Worst case: O(n^2)**

This happens if we pick the largest or the smallest element of the array as the pivot element every time. The subarrays after partitioning have sizes n-1 and 1, so the time taken is O(n^2).

**Average case: O(n log n)**

This is the average time taken over all n! permutations of n elements.

Quicksort is a recursive algorithm, and the space complexity depends on the size of the recursive tree. A stack is used to store the recursive tree.

**Best case: O(log n)**

This happens when the pivot element’s correct position is in the middle of the partitioned range every time. Each subarray is half the size of its parent, so the recursion tree has height O(log n).

**Worst case: O(n)**

This happens when the pivot element is the largest or smallest element of the array in every recursive call. The subarrays after partitioning have sizes n-1 and 1, so the recursion tree has height n.
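The O(n) worst-case stack depth can be avoided with a standard refinement: recurse only into the smaller partition and iterate over the larger one. A sketch in Java (class and method names are ours):

```java
public class QuickSortBoundedStack {
    // Standard Lomuto partition (pivot = arr[end]).
    static int partition(int[] arr, int start, int end) {
        int pivot = arr[end];
        int i = start - 1;
        for (int j = start; j < end; j++) {
            if (arr[j] < pivot) {
                i++;
                int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
            }
        }
        int tmp = arr[i + 1]; arr[i + 1] = arr[end]; arr[end] = tmp;
        return i + 1;
    }

    // Recurse into the smaller partition only; iterate over the larger one.
    // Each recursive call handles at most half the current range, so the
    // stack depth is O(log n) even when partitioning is maximally unbalanced.
    static void quickSort(int[] arr, int start, int end) {
        while (start < end) {
            int pi = partition(arr, start, end);
            if (pi - start < end - pi) {
                quickSort(arr, start, pi - 1);
                start = pi + 1;
            } else {
                quickSort(arr, pi + 1, end);
                end = pi - 1;
            }
        }
    }
}
```

Note that this changes only the space bound; the worst-case running time is still O(n^2).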

**Advantages:** Quicksort

- Is in-place, as it uses only a small auxiliary stack
- Requires only O(n log n) time on average to sort n items

**Disadvantages:** Quicksort

- Is recursive; if recursion is not available, the implementation is extremely complicated
- Requires quadratic (i.e., O(n^2)) time in the worst case
- Is not stable

**Question 1: Which is better — merge sort or quicksort?**

**Answer**: Quicksort is an in-place sorting algorithm, as it doesn’t require an auxiliary array to sort the given array. Merge sort, on the other hand, requires an auxiliary array to merge the sorted sublists. Hence, quicksort has an advantage over merge sort in terms of space.

**Question 2: Is quicksort a stable sorting algorithm?**

**Answer:** Efficient implementations of quicksort are not stable because they swap non-adjacent elements, so the relative order of equal array elements may change. For example, suppose there are two 5s in the array. Let’s call the first 5 “5a” and the second one “5b”. After sorting, 5b can end up at a lower index than 5a. In this article, we have used the non-stable approach. Quicksort can be made stable, but it requires extra O(n) space.

Sorting algorithms interview questions feature in almost every coding interview for software developers. If you’re looking for guidance and help to nail these questions and more, sign up for our free webinar.

As pioneers in the field of technical interview prep, we have trained thousands of software engineers to crack the toughest coding interviews and land jobs at their dream companies, such as Google, Facebook, Apple, Netflix, Amazon, and more!

---------

*Article contributed by Abhinav Jain*