Sorting algorithms arrange elements in a specific order, typically ascending or descending. Understanding their characteristics helps choose the right algorithm for different scenarios.
Bubble sort repeatedly swaps adjacent elements that are in the wrong order. Time complexity is O(n^2) for the average and worst cases and O(n) in the best case, when the array is already sorted; space complexity is O(1). It is simple but inefficient for large datasets and rarely used in practice.
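A minimal sketch in the style of the examples below; the early-exit flag is what gives the O(n) best case on already-sorted input:

// Bubble Sort - O(n^2) average/worst, O(n) best, O(1) space
void bubbleSort(int[] arr) {
    for (int i = 0; i < arr.length - 1; i++) {
        boolean swapped = false;
        for (int j = 0; j < arr.length - 1 - i; j++) {
            if (arr[j] > arr[j + 1]) {
                int tmp = arr[j];           // swap adjacent out-of-order pair
                arr[j] = arr[j + 1];
                arr[j + 1] = tmp;
                swapped = true;
            }
        }
        if (!swapped) break;                // no swaps: array is already sorted
    }
}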
Selection sort repeatedly finds the minimum of the unsorted remainder and places it at the front of that remainder. Time complexity is O(n^2) in all cases and space complexity is O(1). It performs fewer swaps than bubble sort (at most n - 1) but is still inefficient for large data.
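For comparison, a minimal selection sort sketch; note the single swap per outer pass:

// Selection Sort - O(n^2) all cases, O(1) space, at most n - 1 swaps
void selectionSort(int[] arr) {
    for (int i = 0; i < arr.length - 1; i++) {
        int min = i;                        // index of smallest remaining element
        for (int j = i + 1; j < arr.length; j++) {
            if (arr[j] < arr[min]) min = j;
        }
        int tmp = arr[min];                 // one swap per pass
        arr[min] = arr[i];
        arr[i] = tmp;
    }
}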
Insertion sort builds the sorted array one element at a time by inserting each element into its correct position within the already-sorted prefix. Time complexity is O(n^2) for the average and worst cases and O(n) for nearly sorted data; space complexity is O(1). It is efficient for small and nearly sorted datasets and is used inside hybrid algorithms like TimSort.
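A minimal insertion sort sketch; the shifting loop exits almost immediately on nearly sorted input, which is where the O(n) best case comes from:

// Insertion Sort - O(n^2) average/worst, O(n) nearly sorted, O(1) space
void insertionSort(int[] arr) {
    for (int i = 1; i < arr.length; i++) {
        int key = arr[i];                   // element to insert
        int j = i - 1;
        while (j >= 0 && arr[j] > key) {    // shift larger elements right
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;                   // insert into the gap
    }
}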
Merge sort recursively divides the array into halves, sorts them, and merges the sorted halves. Time complexity is a guaranteed O(n log n) in all cases, and space complexity is O(n) for the temporary arrays. It is stable and predictable and works well for large datasets, but it requires extra space.
Quick sort selects a pivot, partitions the array around it, and recursively sorts the partitions. Time complexity is O(n log n) on average and O(n^2) in the worst case (for example, a naive last-element pivot on already-sorted input); space complexity is O(log n) for the recursion stack. It is usually the fastest comparison sort in practice and sorts in place with less memory than merge sort, but it is not stable and bad pivot selection can trigger the quadratic worst case.
Heap sort builds a max heap and repeatedly extracts the maximum to grow the sorted suffix of the array. Time complexity is O(n log n) in all cases and space complexity is O(1). It guarantees O(n log n) with no extra space, but it is slower than quick sort in practice (largely due to poor cache locality) and is not stable.
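A minimal heap sort sketch; heapify here is a sift-down helper that restores the max-heap property below a given root:

// Heap Sort - O(n log n) all cases, O(1) space, not stable
void heapSort(int[] arr) {
    int n = arr.length;
    for (int i = n / 2 - 1; i >= 0; i--) {  // build max heap bottom-up
        heapify(arr, n, i);
    }
    for (int end = n - 1; end > 0; end--) { // move current max to sorted suffix
        int tmp = arr[0];
        arr[0] = arr[end];
        arr[end] = tmp;
        heapify(arr, end, 0);               // re-heapify the shrunken prefix
    }
}
void heapify(int[] arr, int size, int root) {
    int largest = root;
    int left = 2 * root + 1;
    int right = 2 * root + 2;
    if (left < size && arr[left] > arr[largest]) largest = left;
    if (right < size && arr[right] > arr[largest]) largest = right;
    if (largest != root) {                  // sift the root down if a child is larger
        int tmp = arr[root];
        arr[root] = arr[largest];
        arr[largest] = tmp;
        heapify(arr, size, largest);
    }
}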
Choose Quick Sort over Merge Sort when average-case performance matters more than worst-case guarantees, when memory is limited (quick sort uses O(log n) stack space versus merge sort's O(n) buffer), when you need in-place sorting, when stability is not required, and for general-purpose sorting of random data. Choose Merge Sort when you need a guaranteed O(n log n) running time, when you need a stable sort that preserves the relative order of equal elements, when working with linked lists (merge sort needs no random access), and when the extra O(n) space is acceptable.
Example code
// Quick Sort - O(n log n) average, O(n^2) worst
void quickSort(int[] arr, int low, int high) {
    if (low < high) {
        int pi = partition(arr, low, high);  // pivot's final index
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}
int partition(int[] arr, int low, int high) {
    int pivot = arr[high];                   // last element as pivot (Lomuto scheme)
    int i = low - 1;                         // boundary of elements < pivot
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {
            i++;
            swap(arr, i, j);
        }
    }
    swap(arr, i + 1, high);                  // place pivot in its final position
    return i + 1;
}
void swap(int[] arr, int i, int j) {         // helper used above, added so the code compiles
    int tmp = arr[i];
    arr[i] = arr[j];
    arr[j] = tmp;
}
// Merge Sort - O(n log n) always
void mergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2; // avoids int overflow of (left + right)
        mergeSort(arr, left, mid);
        mergeSort(arr, mid + 1, right);
        merge(arr, left, mid, right);
    }
}
void merge(int[] arr, int left, int mid, int right) {
    int n1 = mid - left + 1;
    int n2 = right - mid;
    int[] L = new int[n1];                   // O(n) extra space for the two halves
    int[] R = new int[n2];
    for (int i = 0; i < n1; i++) L[i] = arr[left + i];
    for (int j = 0; j < n2; j++) R[j] = arr[mid + 1 + j];
    int i = 0, j = 0, k = left;
    while (i < n1 && j < n2) {               // take from L on ties to keep the sort stable
        arr[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
    }
    while (i < n1) arr[k++] = L[i++];        // copy any leftovers
    while (j < n2) arr[k++] = R[j++];
}
// Choose Quick Sort: Random data, limited memory
// Choose Merge Sort: Need stability, guaranteed O(n log n)
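A brief usage sketch; the sample values are illustrative and the calls assume the methods above are in scope:

// Usage sketch: both calls sort the full array range
int[] a = {5, 2, 9, 1, 5, 6};
quickSort(a, 0, a.length - 1);   // in place, not stable
int[] b = {5, 2, 9, 1, 5, 6};
mergeSort(b, 0, b.length - 1);   // stable, uses O(n) scratch space
// both arrays now hold {1, 2, 5, 5, 6, 9}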