Sorting Algorithms

| Algorithm | Description | Best (Ω) | Average (Θ) | Worst (O) | Space (worst) |
| --- | --- | --- | --- | --- | --- |
| Heap Sort | Uses a binary heap to repeatedly extract the maximum (or minimum) element into the sorted region. | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) |
| Merge Sort | Divides the array, recursively sorts the halves, and merges them. | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Quick Sort | Chooses a pivot, partitions the array around it, and recursively sorts the partitions. | Ω(n log(n)) | Θ(n log(n)) | O(n²) | O(log(n)) |
| Tree Sort | Builds a binary search tree from the elements, then traverses it in order to read them out sorted. | Ω(n log(n)) | Θ(n log(n)) | O(n²) | O(n) |
| Comb Sort | Improves on bubble sort with a shrinking gap sequence that eliminates "turtles" (small values near the end of the list) that slow bubble sort down. | Ω(n log(n)) | Θ(n²) | O(n²) | O(1) |
| Shell Sort | Extension of insertion sort that exchanges items far apart, progressively shrinking the gap between compared elements. | Ω(n log(n)) | Θ(n(log(n))²) | O(n(log(n))²) | O(1) |
| Cube Sort | Recursively divides the array into sub-cubes, sorts each individually, and merges them back together, balancing time and space complexity. | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Tim Sort | Hybrid of merge sort and insertion sort, designed to perform well on real-world data by exploiting existing order in the input. | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Bubble Sort | Repeatedly compares adjacent elements and swaps them if they are out of order. | Ω(n) | Θ(n²) | O(n²) | O(1) |
| Insertion Sort | Builds the sorted array one element at a time by inserting each element into its place. | Ω(n) | Θ(n²) | O(n²) | O(1) |
| Selection Sort | Repeatedly finds the smallest (or largest) remaining element and places it at the front (or back). | Ω(n²) | Θ(n²) | O(n²) | O(1) |
| Bucket Sort | Distributes elements into buckets, sorts each bucket individually (often with another algorithm, or recursively), and concatenates them. | Ω(n + k) | Θ(n + k) | O(n²) | O(n) |
| Counting Sort | Integer sort that counts occurrences of each distinct key and uses prefix sums to compute each element's position in the output. | Ω(n + k) | Θ(n + k) | O(n + k) | O(k) |
| Radix Sort | Sorts by individual digits or characters, from least to most significant. | Ω(nk) | Θ(nk) | O(nk) | O(n + k) |
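The linear bounds on the non-comparison sorts come from counting rather than comparing. A minimal sketch of a stable counting sort over integers in [0, k) — function name and shape are illustrative, not a canonical implementation:

```python
def counting_sort(arr, k):
    """Stable counting sort for integers in the range [0, k)."""
    counts = [0] * k
    for x in arr:
        counts[x] += 1
    # Prefix sums: counts[v] becomes the output index where value v starts.
    total = 0
    for v in range(k):
        counts[v], total = total, total + counts[v]
    out = [None] * len(arr)
    for x in arr:               # left-to-right pass preserves stability
        out[counts[x]] = x
        counts[x] += 1
    return out
```

Both passes over the input are O(n) and the prefix-sum pass is O(k), giving the O(n + k) time and O(k) auxiliary space in the table.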

Senior-level notes

The things an interviewer expects you to know beyond the complexity table.

Heap Sort

Senior insight

Among the standard comparison sorts, the only one that is O(n log n) worst-case and strictly in-place (O(1) auxiliary). Not stable. Loses to merge sort and quicksort in wall-clock time because of poor cache locality — the sift-down jumps around the array.
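A minimal in-place sketch (the sift-down loop is the part with the cache-hostile access pattern):

```python
def heapsort(a):
    """In-place heapsort: O(n log n) worst case, O(1) auxiliary space."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until the max-heap property holds in a[:end].
        while (child := 2 * root + 1) < end:
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1              # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # Build the heap bottom-up: O(n) total.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly move the current max to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```

Note how `sift_down` touches indices 2i+1, 4i+3, … — strides that defeat the prefetcher, unlike quicksort's sequential partition scans.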

Common traps

  • Assuming it is stable — it is not
  • Using it when the array fits in cache and quicksort would be ~2× faster

Follow-up questions

  • Why is heap sort slower than quicksort in practice despite the same asymptotic bound?

Merge Sort

Senior insight

Stable and O(n log n) worst-case — the canonical choice when stability matters or guaranteed performance is required. The O(n) extra space rules it out for memory-constrained environments. External sorting (data that does not fit in RAM) is almost always merge-based.

Common traps

  • Allocating a new temp array on every recursive call — allocate once at the top
  • Forgetting that the merge itself is O(n) — the "log n" comes from the recursion depth, not the merge
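A sketch illustrating both traps: the temp buffer is allocated exactly once at the top, and each merge pass over a range is O(n). Names are illustrative:

```python
def merge_sort(a):
    """Stable top-down merge sort with a single reusable temp buffer."""
    buf = a[:]                      # one O(n) allocation, reused throughout

    def sort(lo, hi):
        if hi - lo <= 1:
            return
        mid = (lo + hi) // 2
        sort(lo, mid)
        sort(mid, hi)
        buf[lo:hi] = a[lo:hi]       # copy the range, then merge back into a
        i, j = lo, mid
        for k in range(lo, hi):
            # "<=" (not "<") on the left run is what makes the sort stable:
            # on ties, the element from the earlier half wins.
            if j >= hi or (i < mid and buf[i] <= buf[j]):
                a[k] = buf[i]; i += 1
            else:
                a[k] = buf[j]; j += 1

    sort(0, len(a))
    return a
```

Changing that `<=` to `<` is the classic way stability silently breaks — equal elements would then arrive from the right half first.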

Follow-up questions

  • How would you sort a 100 GB file on a machine with 16 GB of RAM?
  • What makes merge sort stable, and what would break that?
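The 100 GB question is answered by external merge sort: sort chunks that fit in memory, write each as a sorted run, then k-way merge the runs. A toy sketch — in-memory lists stand in for the on-disk temp files a real system would use:

```python
import heapq

def external_sort(stream, chunk_size):
    """Toy external merge sort. 'chunk_size' plays the role of available
    RAM; each sorted run would be a temp file on disk in a real system."""
    runs, chunk = [], []
    for item in stream:
        chunk.append(item)
        if len(chunk) == chunk_size:
            runs.append(sorted(chunk))   # one in-memory sort per chunk
            chunk = []
    if chunk:
        runs.append(sorted(chunk))
    # k-way merge holds only one element per run in memory at a time.
    return list(heapq.merge(*runs))
```

With 16 GB of RAM and a 100 GB file you get roughly seven runs, merged in a single streaming pass — total I/O stays near two reads and two writes of the data.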

Quick Sort

Senior insight

Fastest comparison sort in practice on random data — cache-friendly, in-place (ignoring the O(log n) stack). The worst-case O(n²) is what pivot-choice strategies (median-of-three, randomized) defend against. Not stable.

Common traps

  • Using the first / last element as pivot on already-sorted input — degrades to O(n²)
  • Recursing on the larger partition — can blow the stack; recurse on the smaller partition and iterate on the larger to cap the depth at O(log n)
  • Claiming stability — quicksort is not stable unless you go out of your way
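A sketch applying both defenses — median-of-three pivot selection and recursing only into the smaller partition while looping on the larger (names are illustrative):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort: median-of-three pivot, O(log n) stack depth."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        # Median-of-three: move the median of a[lo], a[mid], a[hi] to a[hi].
        # Defends against already-sorted and reverse-sorted input.
        mid = (lo + hi) // 2
        if a[mid] < a[lo]: a[lo], a[mid] = a[mid], a[lo]
        if a[hi] < a[lo]: a[lo], a[hi] = a[hi], a[lo]
        if a[mid] < a[hi]: a[mid], a[hi] = a[hi], a[mid]
        # Lomuto partition around pivot = a[hi].
        pivot, i = a[hi], lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        # Recurse on the smaller side, loop on the larger: the stack
        # stays O(log n) even when partitioning is badly unbalanced.
        if i - lo < hi - i:
            quicksort(a, lo, i - 1)
            lo = i + 1
        else:
            quicksort(a, i + 1, hi)
            hi = i - 1
    return a
```

Note the partition scans are sequential, which is exactly the cache behavior heap sort lacks.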

Follow-up questions

  • How does Introsort combine quicksort with heapsort to guarantee O(n log n)?
  • Implement quickselect — how does it differ from quicksort in complexity?
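A minimal quickselect sketch. The key difference from quicksort: after partitioning, it descends into only one side, so the expected work is n + n/2 + n/4 + … = O(n) rather than O(n log n):

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed), expected O(n)."""
    a = list(items)                 # work on a copy; order is destroyed
    lo, hi = 0, len(a) - 1
    while True:
        if lo == hi:
            return a[lo]
        # Randomized pivot keeps the expected cost linear.
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot, i = a[hi], lo
        for j in range(lo, hi):     # Lomuto partition
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        if k == i:
            return a[i]
        elif k < i:                 # answer is in the left partition
            hi = i - 1
        else:                       # answer is in the right partition
            lo = i + 1
```

The worst case is still O(n²); median-of-medians pivot selection makes it O(n) worst-case at the cost of a larger constant.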

Tim Sort

Senior insight

The default sort in Python, Java (objects), and V8. Exploits "runs" — already-sorted subsequences — giving O(n) on nearly-sorted input. Stable, which is why Java uses it for objects but Dual-Pivot Quicksort for primitives.
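Timsort itself is far too intricate to reproduce here, but the "run" idea can be sketched: count the maximal non-descending runs — the units Timsort merges. Nearly-sorted input has very few runs, so very few merges are needed. (Real Timsort also detects strictly descending runs and reverses them in place; omitted here.)

```python
def count_runs(a):
    """Count maximal non-descending runs, the units Timsort merges."""
    if not a:
        return 0
    runs = 1
    for prev, cur in zip(a, a[1:]):
        if cur < prev:          # a descent marks a run boundary
            runs += 1
    return runs
```

A fully sorted list is one run (Timsort finishes in O(n)); a random permutation has ~n/2 runs, pushing the cost back toward O(n log n).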

Common traps

  • Writing a custom comparator that is not a total order — Timsort (and every comparison sort) will misbehave

Follow-up questions

  • Why does Java use Timsort for Object[] but Dual-Pivot Quicksort for int[]?
  • What is a "run" in Timsort and how does it accelerate nearly-sorted input?