Article written by Shashi Kadapa, under the guidance of Marcelo Lotif Araujo, a Senior Software Developer and an AI Engineer. Reviewed by Mrudang Vora, an Engineering Leader with 15+ years of experience.
Sorting algorithms are instructions to arrange data in a required order, such as ascending or descending. Time complexity specifies the running time of an algorithm as the input size increases, while space complexity examines the memory consumed for the operation.
Sorting is important because it transforms unordered, random data into a structured format. This improves readability, enables quick searches, and lets other algorithms run efficiently.
Sorting algorithms and time complexity are closely related, since complexity describes how an algorithm’s run time grows with input size. This blog examines several important topics on sorting algorithms, time complexity, space complexity, their types, and several important functions and operators.
The blog will help candidates appearing for coding interviews. Read about various types of sorting algorithms and their use cases.
Time complexity is a way to measure how an algorithm’s run time grows as the input size increases. It evaluates how the number of operations scales with respect to the input size n.
Big-O notation expresses this growth rate mathematically. It defines the upper bound of an algorithm’s time complexity, allowing for an understanding of the algorithm’s performance as the data grows.
Space complexity measures the extra memory an algorithm requires to run as the input size increases. Big O notation indicates the relation of input n and the memory used.
Time and space complexity are important since they help to understand application behaviour as data increases. Developers can then ensure that the application remains time-efficient, fast, and responsive.
Analyzing the performance impact of time complexity helps to compare algorithms and balance trade-offs, so that the growth rate of time and memory is measured with reference to input size, not hardware.
Scalability is an algorithm’s capacity to handle increasing input data. Space complexity captures the extra memory different sorting algorithms need as data grows, which matters when memory is limited.
Types of time complexity of sorting algorithms are constant time (O(1)), logarithmic time (O(log n)), linear time (O(n)), linearithmic/quasilinear time (O(n log n)), quadratic time (O(n²)), cubic time (O(n³)), exponential time (O(2ⁿ)), and factorial time (O(n!)).
The following table defines the best, average, and worst cases of an algorithm’s time complexity.
| Case | Definition | When It Occurs |
|---|---|---|
| Best Case | The minimum time an algorithm takes for any valid input of size n. | When the input is in the most favorable condition (e.g., element found at first position in search). |
| Average Case | The expected time taken over all possible inputs of size n. | When inputs are random or uniformly distributed; represents typical real-world performance. |
| Worst Case | The maximum time an algorithm can take for any input of size n. | When the input is in the least favorable condition (e.g., element found at last position or not found). |
The best-case time complexity of a sorting algorithm is the minimum time the algorithm takes to complete for an input of size n. It indicates the most favorable instance, in which the algorithm gives its best performance.
In the time complexity of sorting algorithms, the best case has the lowest running time and performs the fewest iterations.
The average case sorting algorithm time complexity is the expected time for the algorithm to run. All possible inputs of size n and their probabilities are considered. It is a realistic estimate of performance and is obtained by taking scenarios of different inputs.
The average case occurs when the input data is random or uniformly distributed. It reflects typical real-world use.
The worst-case time complexity is the maximum time an algorithm takes to run for an input of size n. Among the cases, it is the most unfavorable scenario, forcing the algorithm to perform the maximum number of operations.
The worst case occurs when all elements must be examined and all steps completed, for example, an unsuccessful linear or binary search, or quick sort on already sorted input with a poor pivot choice.
The following table presents a sorting complexity comparison of 9 algorithms.
| Type of Sort Algorithm | Best Case | Average Case | Worst Case | Space Complexity | Stable |
|---|---|---|---|---|---|
| Bubble | O(n) (optimized) | O(n²) | O(n²) | O(1) | Yes |
| Selection | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Counting | O(n + k) | O(n + k) | O(n + k) | O(n + k) | Yes |
| Radix | O(nk) | O(nk) | O(nk) | O(n + k) | Yes |
| Bucket | O(n + k) | O(n + k) | O(n²) | O(n + k) | Yes |
Note:
- n is the number of elements.
- k is the range of input values, such as the maximum value.
- Stable sorting maintains the relative order of equal elements: Bubble, Insertion, Merge, Counting, Radix, Bucket.
- Not stable: Selection, Quick, Heap.
The above figure compares the complexity growth charts of the sorting algorithms. Color coding shows efficiency, with green for more efficient and red for more expensive.
The x-axis represents the input size, that is, the number of elements sorted. Moving right, the array grows.
The y-axis indicates the operation time or number of steps performed. Higher values mean more effort; moving up, the algorithm does more work.
The performance of sorting algorithms is tested by time complexity. This section examines the time complexity for common sorting algorithms.
The following table compares the three cases and the value for bubble sort.
| Complexity Type | Value |
|---|---|
| Best Case | O(n), Comparisons = n − 1, Swaps = 0 |
| Average Case | O(n²), Comparisons ≈ n(n − 1)/2, Swaps ≈ n(n − 1)/4 |
| Worst Case | O(n²), Comparisons = n(n − 1)/2, Swaps = n(n − 1)/2 |
| Space | O(1) (in-place, no extra memory) |
Time Complexity
Due to nested loops that compare adjacent elements, bubble sort has a worst-case and average-case time complexity of O(n²). The best case, an already sorted array, is O(n) with an optimized approach that uses a flag to detect early completion. It is an in-place algorithm with O(1) auxiliary space complexity.
Space Complexity
The space complexity of bubble sort is O(1) or constant space. It is an in-place sorting algorithm and needs a fixed, small extra memory as a single temporary variable to swap elements regardless of the input data size.
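As an illustration (the blog itself includes no code), here is a minimal Python sketch of the optimized bubble sort with the early-exit flag described above:

```python
def bubble_sort(arr):
    """Optimized bubble sort: a flag detects early completion,
    giving the O(n) best case on already-sorted input."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # last i elements are already in place
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps in a full pass: the array is sorted
            break
    return arr
```

On sorted input, the first pass makes n − 1 comparisons and no swaps, so the loop exits immediately, matching the O(n) best case in the table.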
The following table compares the three cases and the values for selection sort.
| Complexity Type | Value |
|---|---|
| Best Case | O(n²), Comparisons = n(n − 1)/2, Swaps = n − 1 |
| Average Case | O(n²), Comparisons = n(n − 1)/2, Swaps ≈ n − 1 |
| Worst Case | O(n²), Comparisons = n(n − 1)/2, Swaps = n − 1 |
| Space | O(1) (in-place, no extra memory) |
Time Complexity
The time complexity of the selection sort algorithm is O(n²) in all cases, since it always scans the entire unsorted portion to find the minimum. It requires no extra memory beyond the temporary variable needed for swapping.
Space Complexity
Space Complexity is O(1) and extra space is not needed for the Selection sort algorithm.
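A minimal Python sketch of selection sort, illustrating why the comparison count is always n(n − 1)/2 while the swap count stays at most n − 1:

```python
def selection_sort(arr):
    """Selection sort: always O(n^2) comparisons, at most n-1 swaps."""
    n = len(arr)
    for i in range(n - 1):
        # scan the unsorted suffix for the smallest element
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # one swap per outer pass, regardless of input order
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```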
The following table compares the three cases and the values for insertion sort.
| Complexity Type | Value |
|---|---|
| Best Case | O(n), Comparisons = n − 1, Shifts = 0 |
| Average Case | O(n²), Comparisons ≈ n²/4, Shifts ≈ n²/4 |
| Worst Case | O(n²), Comparisons = n(n − 1)/2, Shifts = n(n − 1)/2 |
| Space | O(1) (in-place, no extra memory) |
Time Complexity
The time complexity of insertion sort depends on how ordered the input data already is. The worst-case performance is quadratic, but it is highly efficient for already sorted or nearly sorted data.
Space Complexity
Space complexity of insertion sort is always O(1), indicating constant additional space. It uses in-place sorting and arranges elements in the existing input structure without the need for additional data structures.
Further Reading: Binary Insertion Sort
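A minimal Python sketch of insertion sort; on already sorted input the inner while loop never runs, which is the source of the O(n) best case above:

```python
def insertion_sort(arr):
    """Insertion sort: shifts larger elements right to open a slot."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # shift elements greater than key one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert key into its sorted position
    return arr
```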
The following table compares the three cases and values for the merge sort algorithm.
| Complexity Type | Value |
|---|---|
| Best Case Time | O(n log n) |
| Average Case Time | O(n log n) |
| Worst Case Time | O(n log n) |
| Space Complexity | O(n) |
Time Complexity
Merge sort has a consistent time complexity of O(n log n) for all cases. This consistency arises because it always divides the array in half and merges the subarrays in linear time. Merge sort is a stable, divide-and-conquer algorithm with a space complexity of O(n).
Space Complexity
Space complexity of merge sort is O(n); it needs auxiliary space proportional to the input size for the temporary subarrays created during merging. The recursion stack adds O(log n), and the merging phase needs the extra O(n) memory.
Further Reading: Merge Sort Algorithm
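A minimal Python sketch of merge sort; the `<=` comparison in the merge step is what keeps equal elements in their original relative order, i.e., makes the sort stable:

```python
def merge_sort(arr):
    """Divide-and-conquer merge sort: O(n log n) in every case."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # sort each half recursively
    right = merge_sort(arr[mid:])
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # <= keeps the sort stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```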
The following table compares the three cases and values for the quick sort algorithm.
| Complexity Type | Value |
|---|---|
| Best Case Time | O(n log n) |
| Average Case Time | O(n log n) |
| Worst Case Time | O(n²) |
| Space Complexity | O(log n) |
Time Complexity
The time complexity of the Quick Sort algorithm depends on how the pivot divides the array during the recursive steps. The worst case can be avoided by using a randomized pivot, median-of-three pivot selection, or introsort.
Space Complexity
Space complexity of Quick Sort is determined by the recursion stack depth, which is O(log n) on average. The algorithm sorts in place, so extra arrays are not needed.
Further Reading: Quicksort Algorithm
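A minimal Python sketch of in-place quick sort using a randomized pivot (one of the worst-case mitigations mentioned above) with the Lomuto partition scheme:

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """In-place quick sort; a random pivot makes the O(n^2)
    worst case very unlikely on any input order."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        # pick a random pivot and move it to the end
        p = random.randint(lo, hi)
        arr[p], arr[hi] = arr[hi], arr[p]
        pivot, i = arr[hi], lo
        for j in range(lo, hi):          # Lomuto partition
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]  # pivot to final position
        quick_sort(arr, lo, i - 1)
        quick_sort(arr, i + 1, hi)
    return arr
```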
The following table compares the three cases and values for the heap sort algorithm.
| Case / Metric | Complexity & Value |
|---|---|
| Best Case | Time: O(n log n) |
| Average Case | Time: O(n log n) |
| Worst Case | Time: O(n log n) |
| Space Complexity | O(1) (in-place) |
Time Complexity
The time complexity of Heap Sort is O(n log n) for all cases. In the worst case, it does not degrade to O(n²) performance.
Space Complexity
The space complexity of Heap Sort is O(1). It sorts in place within the input array and needs only a small extra variable for temporary storage while swapping.
Further Reading: Heap Sort Algorithm
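A minimal Python sketch of heap sort: build a max-heap in the array itself, then repeatedly swap the maximum to the end and restore the heap, all with O(1) extra space:

```python
def heap_sort(arr):
    """In-place heap sort: O(n log n) in every case, O(1) extra space."""
    n = len(arr)

    def sift_down(root, end):
        # restore the max-heap property below `root` within arr[:end]
        while (child := 2 * root + 1) < end:
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1                      # pick the larger child
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):   # build max-heap: O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):       # n-1 extractions: O(n log n)
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```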
The following table compares the three cases and values for the counting sort algorithm.
| Case / Metric | Complexity & Value |
|---|---|
| Best Case | Time: O(n + k) |
| Average Case | Time: O(n + k) |
| Worst Case | Time: O(n + k) |
| Space Complexity | O(n + k) |
Time Complexity
The time complexity of counting sort is O(n+k), where k is the input value range, and n is the number of input array elements.
Space Complexity
The space complexity of counting sort is O(n+k). It is obtained from the requirement for two auxiliary structures during the sorting process: the count array and output array.
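A minimal Python sketch of counting sort showing both auxiliary structures named above, the count array of size k + 1 and the output array of size n (assuming non-negative integer input):

```python
def counting_sort(arr):
    """Counting sort for non-negative integers; k is the max value."""
    if not arr:
        return arr
    k = max(arr)
    count = [0] * (k + 1)
    for x in arr:                  # O(n): tally occurrences
        count[x] += 1
    for v in range(1, k + 1):      # O(k): prefix sums give end positions
        count[v] += count[v - 1]
    output = [0] * len(arr)
    for x in reversed(arr):        # reverse pass keeps the sort stable
        count[x] -= 1
        output[count[x]] = x
    return output
```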
The following table compares the three cases and values for the radix sort algorithm.
| Case / Metric | Complexity & Value |
|---|---|
| Best Case | Time: O(d · (n + k)) |
| Average Case | Time: O(d · (n + k)) |
| Worst Case | Time: O(d · (n + k)) |
| Space Complexity | O(n + k) |
Time Complexity
The time complexity of the Radix Sort algorithm is O(d · (n + k)), where d is the number of digits and k is the base (radix) of the number system, such as k = 2 for binary. Each digit position is processed with a stable sorting subroutine, such as Counting Sort, so every element is handled in the same way regardless of the initial order.
Space Complexity
Space complexity of Radix Sort is O(n + k), with k as the base of the number system, such as k = 10 for decimal. Radix Sort needs additional memory for the output array and the count or bucket array.
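A minimal Python sketch of least-significant-digit radix sort for non-negative integers; here the per-digit stable pass is implemented with k buckets rather than an explicit counting sort, which is equivalent for this purpose:

```python
def radix_sort(arr, base=10):
    """LSD radix sort for non-negative integers: O(d * (n + k))."""
    if not arr:
        return arr
    digit = 1
    while max(arr) // digit > 0:
        # stable bucket pass on the current digit
        buckets = [[] for _ in range(base)]
        for x in arr:
            buckets[(x // digit) % base].append(x)
        # concatenating in bucket order preserves stability
        arr = [x for bucket in buckets for x in bucket]
        digit *= base
    return arr
```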
The following table compares the three cases and values for the bucket sort algorithm.
| Case / Metric | Complexity & Value |
|---|---|
| Best Case | Time: O(n + k) |
| Average Case | Time: O(n + k) |
| Worst Case | Time: O(n²) |
| Space Complexity | O(n + k) |
Time Complexity
The time complexity of the bucket sort algorithm depends on the distribution of the input data and the number of buckets. Elements are distributed into buckets, and each bucket is then sorted with another sorting algorithm, often insertion sort.
Space Complexity
Space complexity is O(n + k), where k is the number of buckets. Additional auxiliary space is needed for bucket storage and element distribution.
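A minimal Python sketch of bucket sort, assuming floats uniformly distributed in [0, 1); each bucket is sorted with Python's built-in `sorted`, standing in for the per-bucket sorting algorithm mentioned above:

```python
def bucket_sort(arr, k=10):
    """Bucket sort for floats in [0, 1), using k buckets."""
    buckets = [[] for _ in range(k)]
    for x in arr:
        buckets[int(x * k)].append(x)   # O(n) distribution pass
    result = []
    for b in buckets:
        result.extend(sorted(b))        # small buckets sort fast on average
    return result
```

If all elements land in one bucket (badly skewed data), the work collapses onto the per-bucket sort, which is the source of the O(n²) worst case in the table.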
While choosing the appropriate sorting algorithm, several factors must be considered: the scenario, the size and nature of the data (including whether it is already partially sorted), and system constraints such as the need for stability or limited memory.
Interview questions often ask you to explain and justify the choice of a sorting algorithm. The following table details the factors for selecting the right one.
| Scenario | Best Choice | Reason |
|---|---|---|
| General Purpose / Large Data | Quicksort | Fast due to low overhead and good cache locality. |
| Stability Required | Merge Sort | Guarantees relative order of equal elements and stable time. |
| Small Datasets | Insertion Sort | Extremely low overhead; often faster than “efficient” sorts for tiny arrays. |
| Nearly Sorted Data | Insertion Sort | Approaches linear time when only a few elements are out of place. |
| Memory Constrained | Heap Sort | In-place sorting with guaranteed O(n log n) time and O(1) extra memory. |
| Massive Data (External) | External Merge Sort | Best for data that doesn’t fit in RAM and must be sorted on disk. |
| Linked Lists | Merge Sort | Efficient because it doesn’t require random access. |
Further Reading: Quicksort vs Merge Sort
A cheat sheet of sorting algorithms’ time complexity helps you prepare for coding interviews. The following cheat sheet presents the sorting algorithms’ time and space complexity.
| Algorithm | Best | Average | Worst | Space | Stable | In-Place |
|---|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No |
| Quicksort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | Yes | No |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | Yes | No |
| Bucket Sort | O(n+k) | O(n+k) | O(n²) | O(n+k) | Yes | No |
The blog ‘Sorting Algorithms Time Complexity: Big-O & Space Complexity’ answered several important questions on sorting algorithms’ time complexity. The space and time complexity of sorting algorithms is a key topic in interviews.
The best, average, and worst cases and space for all sorting algorithms were compared along with the values. A cheat sheet of all the sorting algorithms and the conditions for selecting them was presented.
Study, read, practice coding, and attend mock interviews to crack software developer interviews.
Among comparison-based sorting algorithms, Merge Sort, Heap Sort, and Quick Sort provide the optimal time complexity of O(n log n). The reason is that these algorithms meet the theoretical Ω(n log n) lower bound that applies to any comparison-based sort.
Quick Sort is the fastest in practice. The reason is that Quick Sort has an O(n log n) average time complexity with good cache performance and works on contiguous memory. It has low constant factors compared to others.
Space complexity is the additional memory a sorting algorithm needs to run. It matters for balancing memory use against speed, and it is critical when large data sets are processed or the system has limited memory. It guides the choice between in-place algorithms and stable algorithms that need extra space.
Merge Sort is the classic stable sorting algorithm: while merging, it retains the relative order of equal elements, so stability is built into the merge step. It is often preferred because it combines stability with O(n log n) efficiency.
In sorting-algorithm interviews, focus on Quick Sort and Merge Sort. Quick Sort is fast and tests understanding of partitioning and recursion; Merge Sort runs in O(n log n), is stable, and applies to linked lists and divide-and-conquer questions.