Interview Kickstart has enabled over 3500 engineers to uplevel.

Sorting algorithm questions are a big part of coding interviews at tech companies. In this article, we’ve covered all the key information you need to know for your interview preparation, including:

- What Are Sorting Algorithms?
- Importance of Sorting Algorithms
- Characteristics of Sorting Algorithms
- Trade-Offs of Sorting Algorithms
- Common Types of Sorting Algorithms
- Simple Sorting Algorithms
- Efficient Sorting Algorithms
- Non-comparison Sorting Algorithms
- Time Complexities of Sorting Algorithms
- Space Complexities of Sorting Algorithms
- Stable and Unstable Sorting Algorithms
- How to Choose Sorting Algorithms for a Particular Problem
- Frequently Asked Questions

Sorting algorithms are algorithms that rearrange the elements of a list in a specific order.

The order of the sorted list is decided by how elements are compared. The two most common orders of sorting a list are lexicographical order (or dictionary order) and numerical order (non-increasing and non-decreasing order). Have a look at some examples:

**Input list:** 7 5 2 8 10 2

**Order:** Non-Decreasing Order

**Sorted list:** 2 2 5 7 8 10

**Input list:** e v e r y t h i n g

**Order:** Lexicographic Order

**Sorted list:** e e g h i n r t v y

**Input list:** 1 4 5 2 3 5

**Order:** Non-Increasing Order

**Sorted list:** 5 5 4 3 2 1
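The three orderings above can be reproduced with Python's built-in `sorted()` (a quick illustration, not a hand-written sorting algorithm):

```python
# Non-decreasing order (the default)
non_decreasing = sorted([7, 5, 2, 8, 10, 2])          # [2, 2, 5, 7, 8, 10]

# Non-increasing order
non_increasing = sorted([1, 4, 5, 2, 3, 5], reverse=True)  # [5, 5, 4, 3, 2, 1]

# Lexicographic order: sorting the characters of a string
lexicographic = sorted("everything")  # ['e', 'e', 'g', 'h', 'i', 'n', 'r', 't', 'v', 'y']
```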

Sorting algorithms are the backbone of many programming solutions. They can be used to:

- Reduce the complexity of many problems.
- Make raw data meaningful and easier to analyze.
- Make it easier to implement search algorithms to find or retrieve an item from a dataset efficiently.

**Time Complexity:** Time complexity is the amount of time an algorithm takes to run. It is an approximation of the total number of elementary operations (arithmetic/bitwise instructions, memory referencing, control flow, etc.) executed throughout the program. It is assumed that each elementary operation takes a fixed amount of time. An algorithm is time-efficient if it takes fewer operations to complete the task. The time complexity of common sorting algorithms varies from O(n) to O(n^2).

**Space Complexity:** Space complexity describes the amount of memory space required to perform the task as a function of the characteristics of the input. An algorithm is more efficient in terms of space complexity if it requires less extra memory. The space complexity of most sorting algorithms varies from O(1) to O(n). For example, merge sort takes O(n) extra space, whereas bubble sort takes only O(1) extra space.

**Stability:** A sorting algorithm is stable if the relative order of all pairs of identical elements in the original list and the sorted list stays the same. Often it is required that the identical elements maintain their relative order at the end of the sorting process, and in such cases, we must use a stable sorting algorithm. For example, merge sort is a stable sorting algorithm, whereas quicksort isn’t.

**Comparison Sort:** Some sorting algorithms are comparison sorts, while others are not. A comparison sort algorithm sorts the elements only by comparing pairs of elements with a comparison operator. For example, heap sort is a comparison sort, whereas counting sort isn’t.

**Parallelism:** Sorting algorithms are either serial or parallel. A parallel algorithm can do multiple operations at a time. For example, we can easily implement bucket sort to sort buckets in parallel.

**Adaptability:** An adaptive algorithm is one whose runtime depends on the initial order of the elements in the input sequence; it runs faster when the input is already partially sorted. For example, bubble sort is an adaptive algorithm, whereas merge sort isn’t.

**Recursion:** Sorting algorithms can be recursive, non-recursive, or both.

The efficiency of an algorithm is mainly determined by two factors — space and time complexity. We always want to reduce the time complexity as well as space complexity to make the algorithm more efficient.

However, it is not always possible to achieve this. Generally, there is a trade-off between time and space complexity — when we try to reduce one, the other factor increases. For example, let’s analyze what happens if we try to make merge sort an in-place sort:

The merge sort algorithm takes O(n * log(n)) time and O(n) space to sort a list of size n.

- We want to reduce the space complexity to O(1), making it an in-place sorting algorithm.
- The best approach to merge two sorted lists in an in-place merge sort takes O(n * log(n)) time, whereas it takes only O(n) time in the typical implementation of the merge sort, which isn’t an in-place sorting algorithm.
- So the time complexity of merge sort increases by a factor of log(n) when we reduce the space complexity to O(1).

This shows the trade-off between time and space complexity.

Following are the common types of sorting algorithms:

Simple sorting algorithms are called so because they’re easy to understand and implement. These algorithms are seldom used to sort large lists because of their quadratic time complexity. However, when sorting a small list, the runtime doesn’t matter much, and we often use them for their quick and easy implementation.

Bubble sort is one of the simplest sorting algorithms. It repeatedly swaps the adjacent elements if they are in the wrong order until the list is sorted. In every pass, we traverse the whole list and swap adjacent elements in the wrong order.

Bubble sort is named so because, in every iteration, elements with larger values “bubble up” towards the end of the list.

To understand how bubble sort works, check out our article “Bubble Sort Algorithm.”
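The repeated passes described above can be sketched as follows (a minimal illustration; the early exit when no swaps occur is what makes bubble sort adaptive):

```python
def bubble_sort(arr):
    """Repeatedly swap adjacent out-of-order elements until the list is sorted."""
    a = list(arr)  # work on a copy
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in place.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return a
```

For example, `bubble_sort([7, 5, 2, 8, 10, 2])` returns `[2, 2, 5, 7, 8, 10]`.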

Selection sort is a simple sorting algorithm. The algorithm divides the list into two sublists: sorted and unsorted. Initially, the sorted list is empty, and the unsorted list is the whole input list.

At each step, we find the minimum element in the unsorted list and swap it with the leftmost element of the unsorted list. This process increases the sorted sublist’s size by 1 and decreases the unsorted list’s size by 1, moving the lists’ border by 1.

Read our article on Selection Sort to get an in-depth understanding of the algorithm.
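The find-minimum-and-swap step above can be sketched as (a minimal illustration):

```python
def selection_sort(arr):
    """Grow a sorted prefix by repeatedly selecting the minimum of the rest."""
    a = list(arr)  # work on a copy
    n = len(a)
    for i in range(n - 1):
        # Find the minimum element in the unsorted region a[i:].
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # Swap it with the leftmost unsorted element, moving the border by one.
        a[i], a[min_idx] = a[min_idx], a[i]
    return a
```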

Insertion sort is a simple sorting algorithm. The algorithm divides the list into two sublists: sorted and unsorted. Initially, the sorted list is empty, and the unsorted list is the whole input list. At each step, we pick the leftmost element in the unsorted list and place it at the correct position in the sorted list.

For examples, pseudocode, and code, check out “Insertion Sort Algorithm.”
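The pick-and-place step described above can be sketched as (a minimal illustration):

```python
def insertion_sort(arr):
    """Insert each element into its correct position within the sorted prefix."""
    a = list(arr)  # work on a copy
    for i in range(1, len(a)):
        key = a[i]        # leftmost element of the unsorted region
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key    # place the element at its correct position
    return a
```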

Efficient sorting algorithms are called so because they achieve the best asymptotic time complexity, O(n log n), that any comparison-based sorting algorithm can have. Considering only the time complexity, they can be used to sort lists of all kinds, but in practice, they are mainly used to sort large lists.

Merge sort is one of the most efficient and popular sorting algorithms. Merge sort uses the concept of divide and conquer. The algorithm divides the unsorted list into n sublists, each of size one, which can be considered sorted trivially.

The algorithm then repeatedly merges the sorted sublists to produce new sorted sublists until there is only one sorted sublist of size n remaining. This final sorted sublist of size n is the sorted list.

To learn more, read our article on the “Merge Sort Algorithm.”
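A top-down recursive version of the divide-and-merge process above can be sketched as (a minimal illustration):

```python
def merge_sort(arr):
    """Divide the list in half, sort each half recursively, then merge."""
    if len(arr) <= 1:          # a list of size 0 or 1 is trivially sorted
        return list(arr)
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps the merge stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```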

Heap sort is an efficient comparison-based sorting algorithm. It is very similar to the selection sort algorithm. Like selection sort, heap sort divides the list into a sorted and unsorted region. At each step, the algorithm extracts the maximum element from the unsorted region and places it at the beginning of the sorted region.

Unlike selection sort, heap sort does not find the maximum element by traversing the whole unsorted region; instead, it maintains the unsorted region in a heap data structure and extracts the maximum element in logarithmic time.

The heap data structure satisfies the heap-order property, i.e., in a max-heap, the value at each node is greater than or equal to the values of its children. However, after extracting the maximum element from the heap, the heap no longer satisfies the heap-order property. So, we need to rearrange the heap to restore it. This process of rearranging is called heapify.

Read our article on Heap Sort to understand the algorithm in detail.
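The extract-and-heapify loop above can be sketched using an array-based max-heap, where `sift_down` plays the role of the heapify step (a minimal illustration):

```python
def heap_sort(arr):
    """Sort by building a max-heap, then repeatedly extracting the maximum."""
    a = list(arr)  # work on a copy
    n = len(a)

    def sift_down(root, end):
        # Restore the heap-order property for the subtree rooted at `root`,
        # considering only indices below `end`.
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # Build the max-heap in place.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly move the maximum to the front of the sorted region (the array's end).
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```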

Quicksort is an efficient comparison-based in-place sorting algorithm. It is also called partition-exchange sort. Like merge sort, it is a divide-and-conquer algorithm.

The algorithm chooses a pivot element and partitions the list into two halves. The left half contains all the elements smaller than the pivot element, and the right half has all the elements greater than the pivot element. The algorithm then sorts both halves recursively.

There are various versions of the quicksort algorithm, depending upon what we choose as the pivot element.

For a detailed review of quicksort, check out the “Quicksort Algorithm” article.
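The partition-and-recurse scheme above can be sketched with the Lomuto partition, taking the last element as the pivot, which is just one of the possible pivot choices (a minimal illustration):

```python
def quicksort(a, lo=0, hi=None):
    """Sort the list in place using the Lomuto partition scheme."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot, i = a[hi], lo  # last element chosen as the pivot
    # Move every element smaller than the pivot to the left half.
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot between the two halves
    quicksort(a, lo, i - 1)    # sort the left half recursively
    quicksort(a, i + 1, hi)    # sort the right half recursively
    return a
```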

Non-comparison sorting algorithms don’t compare elements to sort the list. Instead, they use other techniques, such as counting the number of elements with each distinct key value or exploiting the internal representation of the elements.

Counting sort is a non-comparison-based sorting algorithm. It counts the number of occurrences of each unique element of the list in an auxiliary array, calculates its prefix sum array, and then traverses the input array, placing each element at its correct position in the output array.

Read our article on Counting Sort — it covers everything from examples to code.
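The count, prefix-sum, and placement steps above can be sketched as (a minimal illustration; traversing the input in reverse during placement keeps the sort stable):

```python
def counting_sort(arr):
    """Sort integers by counting occurrences and computing prefix sums."""
    if not arr:
        return []
    lo, hi = min(arr), max(arr)
    counts = [0] * (hi - lo + 1)
    for x in arr:                     # count occurrences of each value
        counts[x - lo] += 1
    for i in range(1, len(counts)):   # prefix sums give each value's final range
        counts[i] += counts[i - 1]
    out = [0] * len(arr)
    for x in reversed(arr):           # reverse traversal preserves stability
        counts[x - lo] -= 1
        out[counts[x - lo]] = x
    return out
```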

Radix sort is a non-comparison-based sorting algorithm. It sorts the list digit by digit. It considers the digits in order from the least significant to the most significant (LSD to MSD) or from the most significant to the least significant (MSD to LSD). It uses the counting sort algorithm as a subroutine to sort the list by the digit at a given place value.

We have a detailed article on Radix Sort if you wish to deep-dive into it.
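An LSD variant for non-negative integers can be sketched as follows; here each per-digit pass distributes elements into ten lists in order, a simple stable stand-in for the counting-sort pass (a minimal illustration):

```python
def radix_sort(arr):
    """LSD radix sort for non-negative integers."""
    a = list(arr)  # work on a copy
    place = 1
    while a and place <= max(a):
        # Stable pass: distribute by the digit at the current place value.
        buckets = [[] for _ in range(10)]
        for x in a:
            buckets[(x // place) % 10].append(x)
        a = [x for bucket in buckets for x in bucket]
        place *= 10
    return a
```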

Bucket sort is a non-comparison-based sorting algorithm. It is also known as bin sort. It divides the list into several groups called buckets by appropriately mapping each element to a bucket.

After that, it sorts each bucket using any appropriate sorting algorithm or by calling itself recursively, and then concatenates the buckets to get the sorted list. The mapping function is a function of the characteristics of the input. One common mapping function is f(x) = floor(x * n / M), where M is the maximum element of the input list and n is the total number of buckets.

Have a look at our Bucket Sort article for examples, pseudocode, and code.
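The map-sort-concatenate process above can be sketched for non-negative integers; note the mapping here uses M + 1 in the denominator, a slight variant of the f(x) = floor(x * n / M) in the text, so that the maximum element still lands inside the last bucket (a minimal illustration):

```python
def bucket_sort(arr, n_buckets=10):
    """Distribute elements into buckets, sort each bucket, and concatenate."""
    if not arr:
        return []
    m = max(arr)
    buckets = [[] for _ in range(n_buckets)]
    for x in arr:
        # Map each element to a bucket; (m + 1) keeps the index in range.
        buckets[x * n_buckets // (m + 1)].append(x)
    out = []
    for bucket in buckets:
        out.extend(sorted(bucket))  # any appropriate sort works per bucket
    return out
```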

As mentioned before, the time complexity of an algorithm tells the amount of computer time taken by an algorithm to run. It is estimated by the number of elementary operations an algorithm takes to complete its task in terms of characteristics of the input, assuming that it takes a fixed amount of time to execute each elementary operation.

Time complexities of the discussed sorting algorithms (here, k is the range of element values and d is the number of digits):

| Algorithm | Best | Average | Worst |
|---|---|---|---|
| Bubble sort | O(n) | O(n^2) | O(n^2) |
| Selection sort | O(n^2) | O(n^2) | O(n^2) |
| Insertion sort | O(n) | O(n^2) | O(n^2) |
| Merge sort | O(n log n) | O(n log n) | O(n log n) |
| Heap sort | O(n log n) | O(n log n) | O(n log n) |
| Quicksort | O(n log n) | O(n log n) | O(n^2) |
| Counting sort | O(n + k) | O(n + k) | O(n + k) |
| Radix sort | O(d(n + k)) | O(d(n + k)) | O(d(n + k)) |
| Bucket sort | O(n + k) | O(n + k) | O(n^2) |

The space complexity of an algorithm is the amount of space required by the algorithm to complete its task. It is described in terms of the characteristics of the input: the size of the input, the range of its elements, etc.

An in-place algorithm processes the input and produces the output in the same memory location that contains the input, using at most a constant amount of auxiliary space.

Space complexities (auxiliary space) of the discussed sorting algorithms, where k is the range of element values:

| Algorithm | Auxiliary space |
|---|---|
| Bubble sort | O(1) |
| Selection sort | O(1) |
| Insertion sort | O(1) |
| Merge sort | O(n) |
| Heap sort | O(1) |
| Quicksort | O(log n) |
| Counting sort | O(n + k) |
| Radix sort | O(n + k) |
| Bucket sort | O(n + k) |

For more details on the time and space complexity of each of these algorithms, read: “Time and Space Complexities of Sorting Algorithms Explained.”

A sorting algorithm is stable if the relative order of any two equal elements in the original list and the sorted list stays the same.

**Input:** *son sugar dog duck*

If we sort the input by just the first letter:

**A stable sort would always produce:** *dog duck son sugar*

**An unstable sort might produce:** *duck dog sugar son*

The first letter of “dog” and “duck” is the same. However, “dog” appears before “duck” in the input, so in the output produced by a stable sort, “dog” appears before “duck.” The same goes for “son” and “sugar.”
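Python's built-in `sorted()` is guaranteed stable, so it reproduces the stable output above when sorting by first letter only:

```python
words = ["son", "sugar", "dog", "duck"]
# Sort by the first letter only; ties keep their original relative order.
stable = sorted(words, key=lambda w: w[0])
# "dog" stays before "duck", and "son" before "sugar".
```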

Here are the different algorithms and their stability (for typical implementations):

| Algorithm | Stable? |
|---|---|
| Bubble sort | Yes |
| Selection sort | No |
| Insertion sort | Yes |
| Merge sort | Yes |
| Heap sort | No |
| Quicksort | No |
| Counting sort | Yes |
| Radix sort | Yes |
| Bucket sort | Yes (if the per-bucket sort is stable) |

While solving a problem, it’s crucial to know what is important in the problem.

To choose the best sorting algorithm for a particular problem, we should consider the following factors:

- Time complexity
- Space complexity
- Stability
- Characteristics of input such as the size, whether it’s almost sorted, or it’s unsorted to a large extent, etc.

Given these factors, here is when each of the discussed algorithms is a good choice.

**Bubble sort:**

- When stability is important
- When the input size is small, or the elements are nearly sorted

**Selection sort:**

- When stability isn’t important
- When the list is small and swapping two elements is a costly operation
- When the memory space is limited

**Insertion sort:**

- When stability is important
- When the list is small, or the elements are almost sorted
- When the memory space is limited

**Merge sort:**

- When stability is important
- When random access is very expensive compared to sequential access
- When the data structure does not support random access, e.g., a linked list

**Heap sort:**

- When stability isn’t important
- When memory space is limited
- When a guaranteed performance of O(n log n) is expected

**Quicksort:**

- When stability isn’t important
- When memory space is limited
- When the dataset isn’t very large, that is, it fits in memory
- Although its worst-case time complexity is O(n^2), the worst case occurs rarely, and quicksort tends to be fast in practice, making it a good default choice

**Counting sort:**

- When stability is important
- When the range of the elements in the list is limited, and elements repeat frequently
- When the range of the elements in the list is of the order of the size of the list

**Radix sort:**

- When stability is important
- When elements don’t repeat much, but each element is short. For example, it can sort a million 6-digit numbers or a thousand strings containing at most 3 characters

**Bucket sort:**

- When the elements are uniformly distributed over a range
- When we can utilize a large degree of parallelism, because each bucket can be sorted in parallel with the others


**Question 1: Can any unstable sorting algorithm be made stable?**

Answer: Yes, any sorting algorithm can be made stable. There may be specific tricks for particular algorithms, but here is a general strategy that works for any of them: consider every element as a pair of its value and its position in the input. We first compare the elements by their value, and if the values turn out to be equal, we compare them by their position. As the position of every element is unique, no two pairs can be identical.

**Example:**

**Input: son sugar dog duck**

Modified input: (son, 1) (sugar, 2) (dog, 3) (duck, 4)

Now sort the words by their first letter.

Any sorting algorithm (stable or unstable) would produce:

(dog, 3) (duck, 4) (son, 1) (sugar, 2)

Now remove the positions from the pairs:

**Output: dog duck son sugar**

**(dog, 3) and (duck, 4):** The first character of both words is the same, but 3 is less than 4, so (dog, 3) will appear before (duck, 4), maintaining their relative position, thus making the algorithm stable. The same goes for (son, 1) and (sugar, 2).
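The value-position pairing above can be sketched as (a minimal illustration; any sorting algorithm applied to these pairs produces the same, stable result):

```python
words = ["son", "sugar", "dog", "duck"]
# Pair each word with its 1-based position in the input.
pairs = list(enumerate(words, start=1))  # [(1, 'son'), (2, 'sugar'), ...]
# Compare by first letter, breaking ties by original position.
pairs.sort(key=lambda p: (p[1][0], p[0]))
# Remove the positions to get the final, stably sorted output.
output = [word for _, word in pairs]
```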

**Question 2: What is the difference between internal and external sorting?**

Answer: If all the data to be sorted is stored in the memory throughout the sorting process, the sorting is called internal sorting. In external sorting, data is stored outside the memory (like on disk) and is loaded in small chunks into the memory. External sorting is generally applied when the dataset is very large and can’t fit into the memory.

Commonly, the merge sort algorithm is used for external sorting because it breaks the task into sorting many smaller chunks that fit into the memory, and it doesn’t require random access to the dataset.

If you’re looking for guidance and help with getting your prep started, sign up for our free webinar. As pioneers in the field of technical interview prep, we have trained thousands of software engineers to crack the toughest coding interviews and land jobs at their dream companies, such as Google, Facebook, Apple, Netflix, Amazon, and more!

----------

*Article contributed by Taara Sinh Aatrey*