Sorting algorithms

Sorting:

Understanding common sorting algorithms is very valuable. Many sorting or searching problems are simple tweaks of these well-known algorithms. Here are the important sorting algorithms:

Bubble Sort:

We start at the beginning of the array and pass through it, swapping consecutive elements whenever they are not in the given order. Here is an example:

pass 1:

                                                  6,5,4,9,8,5 --> 5,6,4,9,8,5 {Because 6 is greater than 5}
                                                  5,6,4,9,8,5 --> 5,4,6,9,8,5
                                                  5,4,6,9,8,5 --> 5,4,6,9,8,5 {6 and 9 are already in order}
                                                  5,4,6,9,8,5 --> 5,4,6,8,9,5
                                                  5,4,6,8,9,5 --> 5,4,6,8,5,9

pass 2:

                                                  5,4,6,8,5,9 --> 4,5,6,8,5,9
\[\vdots\]

We do this until the array is sorted. The time complexity of this algorithm is obviously $O(n^2)$.
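
Here is one way this might look in C++ (a minimal sketch; the function name and the early-exit flag, which stops once a pass makes no swaps, are just illustrative choices and not part of the description above):

#include <utility>
#include <vector>

// Bubble sort: repeatedly sweep the array, swapping adjacent
// out-of-order elements, until a full pass makes no swaps.
void bubbleSort(std::vector<int>& arr) {
    for (std::size_t pass = 0; pass + 1 < arr.size(); ++pass) {
        bool swapped = false;
        for (std::size_t i = 0; i + 1 < arr.size() - pass; ++i) {
            if (arr[i] > arr[i + 1]) {
                std::swap(arr[i], arr[i + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;   // no swaps in this pass: already sorted
    }
}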

Selection Sort:

We start at the beginning of the array and pass through it to find the minimum element, which we place at the first location. In the next round we find the second minimum and place it at the second location, and so on. We do this until the array is sorted. The time complexity is again $O(n^2)$.

                                                  6,5,4,9,8,5 --> 4,5,6,9,8,5
                                                  4,5,6,9,8,5 --> 4,5,6,9,8,5
                                                  4,5,6,9,8,5 --> 4,5,5,9,8,6
                                                  4,5,5,9,8,6 --> 4,5,5,6,8,9
\[\vdots\]
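
Here is a minimal C++ sketch of the same idea (one possible implementation, not the only way to write it):

#include <utility>
#include <vector>

// Selection sort: in round i, find the minimum of arr[i..n-1]
// and swap it into position i.
void selectionSort(std::vector<int>& arr) {
    for (std::size_t i = 0; i + 1 < arr.size(); ++i) {
        std::size_t minIndex = i;
        for (std::size_t j = i + 1; j < arr.size(); ++j) {
            if (arr[j] < arr[minIndex]) minIndex = j;
        }
        std::swap(arr[i], arr[minIndex]);   // place the i-th minimum at position i
    }
}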

Merge Sort:

We divide the array into two halves, call merge sort on each half, merge the two sorted halves together, and return the resulting array. As is visible, we use recursion here.

Here is the algorithm:

mergesort:
  • Divide the given array into two halves.
  • Pass each half into mergesort.
  • Merge the two sorted halves into one array.
  • Return this new array.
Here is the algorithm for the merge operation:

merge:
  • Create two new arrays with lengths equal to the two subarrays and copy the contents into them.
  • Iterate over the two arrays together: at each step compare their front elements and push the smaller one into the original array. When one array runs out, copy over the remainder of the other.
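
Here is a minimal C++ sketch following the steps above (calling mergeSort(arr, 0, (int)arr.size() - 1) sorts the whole array):

#include <vector>

// Merge the two sorted halves arr[left..mid] and arr[mid+1..right].
void merge(std::vector<int>& arr, int left, int mid, int right) {
    std::vector<int> leftHalf(arr.begin() + left, arr.begin() + mid + 1);
    std::vector<int> rightHalf(arr.begin() + mid + 1, arr.begin() + right + 1);
    std::size_t i = 0, j = 0;
    int k = left;
    // Pick the smaller front element of the two halves each time.
    while (i < leftHalf.size() && j < rightHalf.size())
        arr[k++] = (leftHalf[i] <= rightHalf[j]) ? leftHalf[i++] : rightHalf[j++];
    // Copy whatever remains in either half.
    while (i < leftHalf.size()) arr[k++] = leftHalf[i++];
    while (j < rightHalf.size()) arr[k++] = rightHalf[j++];
}

void mergeSort(std::vector<int>& arr, int left, int right) {
    if (left >= right) return;          // zero or one element: already sorted
    int mid = left + (right - left) / 2;
    mergeSort(arr, left, mid);          // sort the left half
    mergeSort(arr, mid + 1, right);     // sort the right half
    merge(arr, left, mid, right);       // merge the two sorted halves
}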

Bucket Sort:

It is mainly used when the input is uniformly distributed over a range. Suppose you want to sort floating-point numbers in the range 0 to 1. Here is the algorithm for bucketSort:

bucketSort(arr[], n):

  • Create n empty buckets (you can use vectors for them).
  • For each element arr[i] in arr, insert arr[i] into bucket number floor(n*arr[i]).
  • Sort the individual buckets.
  • Concatenate all the sorted buckets.
Step 1 takes O(n) time, and step 2 is O(n) as well: each insertion is O(1) if the containers are chosen wisely. Because the input is uniformly distributed, each bucket holds only a few elements on average, so sorting all the buckets in step 3 also takes O(n) on average. The last step is again O(n). Thus, the overall process has O(n) average-case complexity.
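
Here is a minimal C++ sketch of this, assuming all values lie in [0, 1) so that the bucket index floor(n*x) stays in range:

#include <algorithm>
#include <vector>

// Bucket sort for values uniformly distributed in [0, 1).
void bucketSort(std::vector<double>& arr) {
    std::size_t n = arr.size();
    if (n == 0) return;
    std::vector<std::vector<double>> buckets(n);               // n empty buckets
    for (double x : arr)
        buckets[static_cast<std::size_t>(n * x)].push_back(x); // bucket index = floor(n*x)
    std::size_t k = 0;
    for (auto& bucket : buckets) {
        std::sort(bucket.begin(), bucket.end());               // sort each (small) bucket
        for (double x : bucket) arr[k++] = x;                  // concatenate back into arr
    }
}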


Quick Sort:

Quicksort is the most commonly used sorting algorithm; many library sorting functions are based on it or on variants of it.
In quicksort we pick an element of the array, usually the last element, called the pivot. We then rearrange the array so that all elements smaller than the pivot are to its left and all larger elements are to its right. This step is generally called partitioning.

This is what partition does. Here we partition about the pivot 4:
\[\text{partition}(\{1,5,6,8,2,4\}) \rightarrow \{2,1,4,5,8,6\}\]
The order of elements within the left and right subarrays does not matter.

We now apply quicksort to the subarrays to the left and to the right of the pivot. As can be seen, this is a recursive algorithm.


QuickSort(arr, start, end)

  • If start >= end, return (a subarray of zero or one element is already sorted).
  • Partition about the pivot arr[end] and get the pivot's final location: p = partition(arr, start, end)
  • quicksort(arr, start, p-1)
  • quicksort(arr, p+1, end)

Here is the algorithm for partition:

Partition(arr, start, end)

  • Define pivot = arr[end]
  • p_index = start // this is the position we will place the pivot at
  • loop for i = start up to end
  • ..... if arr[i] < pivot
  • .......... swap(arr[p_index], arr[i])
  • .......... p_index++
  • Once the loop is completed, swap(arr[p_index], arr[end]) and return p_index. (The last iteration of the loop compares the pivot with itself, so it never swaps.)
Suppose we apply this partition on $\{1,5,3,2,4\}$:

pivot = 4
p_index = 0

loop starts:
first time in the loop:
        1<pivot? Yes
             swap(arr[0], arr[0])
             p_index++
arr is now:
$\{1,5,3,2,4\}$ and  p_index=1

second time through the loop:
         5<pivot? No


arr is now:
$\{1,5,3,2,4\}$ and p_index=1

Third time through the loop:
          3<pivot? Yes
           swap(arr[2], arr[p_index])
           p_index ++


arr is now:
$\{1,3,5,2,4\}$ and p_index=2


fourth time through the loop:
          2<pivot? Yes
          swap(arr[3], arr[p_index])
          p_index++

arr is now:
$\{1,3,2,5,4\}$ and p_index=3

last time through the loop:
          4<pivot? No

arr is now:
$\{1,3,2,5,4\}$ and p_index=3


Out of the loop:
swap(arr[end], arr[p_index])

arr is now:
$\{1,3,2,4,5\}$



This is what the partition function does: now all the numbers smaller than the pivot are to its left and all those bigger than it are to its right.
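
Putting the pieces together, here is a minimal C++ sketch of quicksort with the partition described above. The loop in partition stops just before the pivot, since comparing the pivot with itself never causes a swap; the main function is just an illustrative driver using the example array from the trace.

#include <iostream>
#include <utility>
#include <vector>

// Partition about the last element and return the pivot's final position.
int partition(std::vector<int>& arr, int start, int end) {
    int pivot = arr[end];
    int p_index = start;                 // where the pivot will eventually go
    for (int i = start; i < end; ++i) {  // stop just before the pivot itself
        if (arr[i] < pivot) {
            std::swap(arr[p_index], arr[i]);
            ++p_index;
        }
    }
    std::swap(arr[p_index], arr[end]);   // place the pivot at its final spot
    return p_index;
}

void quickSort(std::vector<int>& arr, int start, int end) {
    if (start >= end) return;            // zero or one element: nothing to do
    int p = partition(arr, start, end);
    quickSort(arr, start, p - 1);        // sort the part left of the pivot
    quickSort(arr, p + 1, end);          // sort the part right of the pivot
}

int main() {
    std::vector<int> arr = {1, 5, 3, 2, 4};
    quickSort(arr, 0, static_cast<int>(arr.size()) - 1);
    for (int x : arr) std::cout << x << ' ';   // prints: 1 2 3 4 5
    std::cout << '\n';
}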
