Quick overview of simple sorting algorithms. We cannot simply ignore them, as they form the basis of many other algorithms.
Bubble Sort - After every pass the biggest remaining element is sifted to the end. Many swaps, since each element is compared with its next neighbour and swapped if out of order.
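A minimal sketch of the idea above (the early-exit flag is an optional refinement, not part of the description):

```python
def bubble_sort(a):
    """In-place bubble sort; after pass i, the i+1 largest elements sit at the end."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # swap out-of-order neighbours
                swapped = True
        if not swapped:  # no swaps in a full pass means the array is sorted
            break
    return a
```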
Insertion Sort - Similar to a human arranging cards. Element A[k] is inserted into the already-sorted prefix A[0 to k-1], shifting larger elements one position right. We can use binary search to find the position of the kth element in the sorted prefix A[0 to k-1] in O(log k) comparisons, but the actual insertion may still cause up to k shifts.
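A sketch of the plain (shift-based) variant; the binary-search variant only speeds up finding the position, not the shifting:

```python
def insertion_sort(a):
    """In-place insertion sort: grow a sorted prefix a[0..k-1], then insert a[k]."""
    for k in range(1, len(a)):
        key = a[k]
        j = k - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger elements one slot right
            j -= 1
        a[j + 1] = key  # drop the element into its place in the prefix
    return a
```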
Selection Sort - Minimizes the number of swaps. In every iteration k, find the smallest element in A[k to n-1] and swap it into position k at the end of the iteration.
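A sketch showing the one-swap-per-pass property:

```python
def selection_sort(a):
    """In-place selection sort: at most one swap per iteration."""
    n = len(a)
    for k in range(n - 1):
        m = min(range(k, n), key=a.__getitem__)  # index of smallest in a[k..n-1]
        if m != k:
            a[k], a[m] = a[m], a[k]  # single swap at the end of the pass
    return a
```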
Quick Sort - Uses a divide-and-conquer strategy: pick a pivot, partition the array into elements smaller and larger than the pivot, then recurse on each part.
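The recursion can be sketched as below (a list-building version for clarity; real implementations partition in place):

```python
def quick_sort(a):
    """Quicksort: partition around a pivot, recurse on the two sides."""
    if len(a) <= 1:
        return a  # base case: nothing to sort
    pivot = a[len(a) // 2]
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```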
Counting or Bucket Sort
We can implement a simple counting sort using an array of size k when sorting numbers in the range 1…k. We need an ARRAY[k] to hold the frequency of each value, and at the end we output each value as many times as it occurs.
When to use counting sort? When k is smaller than n (k << n).
e.g. Suppose you want to sort 10000 people by birthday: n = 10000, k = 366, so time is O(n + k) = O(n) and memory is O(k).
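A sketch of the frequency-array idea, assuming values in 1…k as above:

```python
def counting_sort(a, k):
    """Counting sort for integers in 1..k: O(n + k) time, O(k) extra memory."""
    count = [0] * (k + 1)  # count[v] = how many times v occurs
    for x in a:
        count[x] += 1
    out = []
    for v in range(1, k + 1):  # emit each value as often as it occurred
        out.extend([v] * count[v])
    return out
```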
Radix sort - how to efficiently sort a big list of dates
What to do when k is large?
Think about the decimal representation of a number: any number x needs about log10(x) digits.
x = a + 10b + 100c + 1000d + … where a, b, c, etc. are all in the range 0..9.
radix sort(L):
{
    bucket sort by a;   (least significant digit first)
    bucket sort by b;
    bucket sort by c;
    ...
}
Each bucket sort must be stable, so that the order established by earlier passes is preserved.
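The pseudocode above can be made concrete as an LSD (least-significant-digit-first) radix sort over decimal digits; each pass is a stable bucket sort:

```python
def radix_sort(a):
    """LSD radix sort for non-negative integers: one stable bucket sort per
    decimal digit, starting from the least significant digit."""
    if not a:
        return a
    place = 1
    biggest = max(a)
    while place <= biggest:
        buckets = [[] for _ in range(10)]
        for x in a:
            buckets[(x // place) % 10].append(x)  # stable: keeps current order
        a = [x for b in buckets for x in b]  # concatenate buckets 0..9
        place *= 10
    return a
```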
Sorting floating point numbers
Floating point represents numbers roughly in the form x = a * 2^b. If all numbers had the same value of b, we could just bucket sort by a. Different b's complicate the picture, but since b is more important than a, we sort by a first (as in radix sort, least significant key first) and then do one more stable bucket sort by b.
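A sketch of the two-pass idea for positive floats, using math.frexp to split x into mantissa and exponent (x = m * 2^e with 0.5 <= m < 1). Python's stable sorted stands in here for the stable bucket-sort passes, purely to illustrate the ordering of the two passes:

```python
import math

def float_sort(xs):
    """Sort positive floats with two stable passes: first by mantissa (less
    significant key), then by exponent (more significant key)."""
    xs = sorted(xs, key=lambda x: math.frexp(x)[0])  # pass 1: mantissa
    xs = sorted(xs, key=lambda x: math.frexp(x)[1])  # pass 2: exponent (stable)
    return xs
```

Because the second pass is stable, numbers with equal exponents keep the mantissa order from the first pass, which is exactly the radix-sort argument from the previous section.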
Stability - A sort is stable if equal elements keep their original relative order. Any comparison-based sorting algorithm can be made stable by modifying the comparisons to break ties according to the original positions of the objects, but few are automatically stable (insertion and bubble sort are; selection sort, as usually written, is not).
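The tie-breaking trick can be sketched by tagging each record with its original index before sorting (the records here are made-up illustrative data):

```python
# Hypothetical records: (name, grade); we sort by grade only.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

# Tag each record with its original position; ties in the key now break
# by that position, making the overall order stable.
indexed = [(rec[1], i, rec) for i, rec in enumerate(records)]
indexed.sort()  # lexicographic: key first, then original index
stable_order = [rec for _, _, rec in indexed]
```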