fix(guide): simplify directory structure

This commit is contained in:
Mrugesh Mohapatra
2018-10-16 21:26:13 +05:30
parent f989c28c52
commit da0df12ab7
35752 changed files with 0 additions and 317652 deletions

View File

@ -0,0 +1,172 @@
---
title: Bubble Sort
---
## Bubble Sort
Bubble Sort is the simplest sorting algorithm. It works by repeatedly stepping through the list and swapping adjacent elements that are in the wrong order.
It is very slow compared to algorithms like Quick Sort, with a worst-case complexity of O(n^2). The tradeoff is that Bubble Sort is one of the easiest sorting algorithms to implement from scratch.
### Example:
#### First Pass:
( 5 1 4 2 8 ) > ( 1 5 4 2 8 ), Here, the algorithm compares the first two elements and swaps them since 5 > 1.
( 1 5 4 2 8 ) > ( 1 4 5 2 8 ), Swap since 5 > 4
( 1 4 5 2 8 ) > ( 1 4 2 5 8 ), Swap since 5 > 2
( 1 4 2 5 8 ) > ( 1 4 2 5 8 ), Now, since these elements are already in order (8 > 5), the algorithm does not swap them.
#### Second Pass:
( 1 4 2 5 8 ) > ( 1 4 2 5 8 )
( 1 4 2 5 8 ) > ( 1 2 4 5 8 ), Swap since 4 > 2
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
Now the array is already sorted, but our algorithm does not know that yet. The algorithm needs one whole pass without any swap to know the array is sorted.
#### Third Pass:
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
( 1 2 4 5 8 ) > ( 1 2 4 5 8 )
#### Properties
- Space complexity: O(1)
- Best case performance: O(n)
- Average case performance: O(n^2)
- Worst case performance: O(n^2)
- Stable: Yes
### Video Explanation
[Bubble sort in easy way](https://www.youtube.com/watch?v=Jdtq5uKz-w4)
-----
### Example in JavaScript
```js
let arr = [1, 4, 7, 45, 7, 43, 44, 25, 6, 4, 6, 9];
let sorted = false;

while (!sorted) {
  // Assume the array is sorted until a swap proves otherwise
  sorted = true;
  for (let i = 1; i < arr.length; i++) {
    if (arr[i] < arr[i - 1]) {
      // Swap adjacent elements that are out of order
      let temp = arr[i];
      arr[i] = arr[i - 1];
      arr[i - 1] = temp;
      sorted = false;
    }
  }
}

console.log(arr);
```
### Example in Java
```java
public class BubbleSort {
    static void bubbleSort(int[] arr) {
        int n = arr.length;
        int temp;
        for (int i = 0; i < n; i++) {
            for (int x = 1; x < (n - i); x++) {
                if (arr[x - 1] > arr[x]) {
                    // Swap adjacent elements that are out of order
                    temp = arr[x - 1];
                    arr[x - 1] = arr[x];
                    arr[x] = temp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = new int[15];
        for (int i = 0; i < arr.length; i++) {
            arr[i] = (int) (Math.random() * 100 + 1);
        }

        System.out.println("array before sorting");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }

        bubbleSort(arr);

        System.out.println("\narray after sorting");
        for (int i = 0; i < arr.length; i++) {
            System.out.print(arr[i] + " ");
        }
    }
}
```
### Example in C++
```c++
// Recursive implementation
void bubbleSort(int arr[], int n)
{
    // Base case: an array of one element is already sorted
    if (n == 1)
        return;

    bool swapped = false;
    // After this pass the largest element moves to its final position
    for (int i = 0; i < n - 1; i++)
    {
        if (arr[i] > arr[i + 1])
        {
            int temp = arr[i];
            arr[i] = arr[i + 1];
            arr[i + 1] = temp;
            swapped = true;
        }
    }

    // If no two elements were swapped in this pass, the array is sorted
    if (!swapped)
        return;

    // Recurse on the remaining (unsorted) prefix of the array
    bubbleSort(arr, n - 1);
}
```
### Example in Swift
```swift
func bubbleSort(_ inputArray: [Int]) -> [Int] {
guard inputArray.count > 1 else { return inputArray } // make sure our input array has more than 1 element
var numbers = inputArray // function arguments are constant by default in Swift, so we make a copy
for i in 0..<(numbers.count - 1) {
for j in 0..<(numbers.count - i - 1) {
if numbers[j] > numbers[j + 1] {
numbers.swapAt(j, j + 1)
}
}
}
return numbers // return the sorted array
}
```
### Example in Python
```py
def bubble_sort(A):
    for i in range(len(A)):
        # Bubble the smallest remaining element down to position i
        for k in range(len(A) - 1, i, -1):
            if A[k] < A[k - 1]:
                swap(A, k, k - 1)

def swap(A, x, y):
    A[x], A[y] = A[y], A[x]
```
### More Information
- [Wikipedia](https://en.wikipedia.org/wiki/Bubble_sort)
- [Bubble Sort Algorithm - CS50](https://youtu.be/Ui97-_n5xjo)
- [Bubble Sort Algorithm - GeeksForGeeks (article)](http://www.geeksforgeeks.org/bubble-sort)
- [Bubble Sort Algorithm - MyCodeSchool (video)](https://www.youtube.com/watch?v=Jdtq5uKz-w4)
- [Algorithms: Bubble Sort - HackerRank (video)](https://www.youtube.com/watch?v=6Gv8vg0kcHc)
- [Bubble Sort Algorithm - GeeksForGeeks (video)](https://www.youtube.com/watch?v=nmhjrI-aW5o)
- [Bubble Sort Visualization](https://www.hackerearth.com/practice/algorithms/sorting/bubble-sort/visualize/)

View File

@ -0,0 +1,51 @@
---
title: Bucket Sort
---
## What is Bucket Sort?
Bucket sort is a distribution-based sorting algorithm that works by distributing the elements of an array into a number of buckets and then sorting these buckets
individually. Each bucket is sorted using a separate sorting algorithm or by applying the bucket sort algorithm recursively.
Bucket sort is mainly useful when the input is uniformly distributed over a range.
## Assume one has the following problem in front of them:
One has been given a large array of floating point numbers lying uniformly between a lower and an upper bound, and this array needs to be
sorted. A simple way to solve the problem would be to use a general-purpose sorting algorithm such as Merge Sort, Heap Sort or Quick Sort. However,
as these are comparison-based algorithms, they cannot do better than O(N log N) comparisons.
Using bucket sort, the above task can be completed in O(N) time on average, because the input is uniformly distributed.
Let's have a closer look at it.
First, create an array of lists, i.e. of buckets. Each element is then inserted into the bucket corresponding to its value range.
Each of these buckets can then be sorted individually using Insertion Sort.
### Pseudo Code for Bucket Sort:
```
void bucketSort(float[] a, int n)
{
    // scatter: place every element into its bucket
    for (each element 'x' in a)
    {
        insert x into bucket[n * x];   // assumes 0 <= x < 1
    }
    // sort each bucket, then gather
    for (each bucket)
    {
        sort(bucket);
    }
    concatenate all buckets back into a;
}
```
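To make this concrete, here is a minimal Python sketch of the approach described above. It assumes the input values lie uniformly in the range [0, 1), so `int(x * num_buckets)` picks a bucket index; the function and parameter names are only illustrative.
```python
def bucket_sort(values, num_buckets=10):
    # Assumes the values lie uniformly in [0, 1)
    buckets = [[] for _ in range(num_buckets)]

    # Scatter: place each value into the bucket covering its sub-range
    for x in values:
        buckets[int(x * num_buckets)].append(x)

    # Sort each bucket individually (this plays the role of the
    # insertion sort mentioned above; any sort will do)
    for bucket in buckets:
        bucket.sort()

    # Gather: concatenate the buckets back into one sorted list
    return [x for bucket in buckets for x in bucket]

print(bucket_sort([0.42, 0.32, 0.23, 0.52, 0.25, 0.47, 0.51]))
# [0.23, 0.25, 0.32, 0.42, 0.47, 0.51, 0.52]
```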
### More Information:
- [Wikipedia](https://en.wikipedia.org/wiki/Bucket_sort)
- [GeeksForGeeks](http://www.geeksforgeeks.org/bucket-sort-2/)

View File

@ -0,0 +1,56 @@
---
title: Counting Sort
---
## Counting Sort
Counting Sort is a sorting technique for keys that lie in a specific range. It works by counting the number of objects having each distinct key value (a kind of hashing), and then using some arithmetic on those counts to calculate the position of each object in the output sequence.
### Example:
```
For simplicity, consider the data in the range 0 to 9.
Input data: 1, 4, 1, 2, 7, 5, 2
1) Take a count array to store the count of each unique object.
Index: 0 1 2 3 4 5 6 7 8 9
Count: 0 2 2 0 1 1 0 1 0 0
2) Modify the count array such that each element at each index
stores the sum of previous counts.
Index: 0 1 2 3 4 5 6 7 8 9
Count: 0 2 4 4 5 6 6 7 7 7
The modified count array indicates the position of each object in
the output sequence.
3) Output each object from the input sequence followed by
decreasing its count by 1.
Process the input data: 1, 4, 1, 2, 7, 5, 2. Position of 1 is 2.
Put data 1 at index 2 in output. Decrease count by 1 to place
next data 1 at an index 1 smaller than this index.
```
### Implementation
```js
let numbers = [1, 4, 1, 2, 7, 5, 2];
let count = [];
let i, z = 0;
let max = Math.max(...numbers);

// initialize the counter array with zeros
for (i = 0; i <= max; i++) {
  count[i] = 0;
}

// count the occurrences of each value
for (i = 0; i < numbers.length; i++) {
  count[numbers[i]]++;
}

// overwrite the input array in sorted order
for (i = 0; i <= max; i++) {
  while (count[i]-- > 0) {
    numbers[z++] = i;
  }
}

// output sorted array
for (i = 0; i < numbers.length; i++) {
  console.log(numbers[i]);
}
```
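The snippet above counts occurrences and rewrites the array directly, which is enough for plain integers. The worked example, however, describes the stable, position-based variant that also works when each key belongs to a larger record. Here is a small Python sketch of that variant (using 0-based output indices; the names are only illustrative):
```python
def counting_sort(data, max_value):
    # 1) Count the occurrences of each key
    count = [0] * (max_value + 1)
    for x in data:
        count[x] += 1

    # 2) Prefix sums: count[k] becomes the number of elements <= k
    for k in range(1, max_value + 1):
        count[k] += count[k - 1]

    # 3) Place each element at its position, scanning right to left
    #    so that equal keys keep their original order (stability)
    output = [0] * len(data)
    for x in reversed(data):
        count[x] -= 1
        output[count[x]] = x
    return output

print(counting_sort([1, 4, 1, 2, 7, 5, 2], 9))  # [1, 1, 2, 2, 4, 5, 7]
```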

View File

@ -0,0 +1,128 @@
---
title: Heapsort
---
## Heapsort
Heapsort is an efficient sorting algorithm based on the use of max/min heaps. A heap is a tree-based data structure that satisfies the heap property: in a max heap, the key of any node is less than or equal to the key of its parent (if it has a parent). As a consequence, the maximum element is always at the root of a max heap. After removing the root, the heap property can be restored in O(log n) time using the maxHeapify operation. We perform this extraction n times, each time moving the current maximum out of the heap and into its final position in the sorted array. Thus, after n iterations we have a sorted version of the input array. The algorithm runs in O(n log n) time and O(1) additional space [O(n) including the space required to store the input data], since all operations are performed in place.
The best, worst, and average case time complexity of Heapsort is O(n log n). Although Heapsort has a better worst-case complexity than Quicksort, a well-implemented Quicksort usually runs faster in practice. Heapsort is a comparison-based algorithm, so it can be used for non-numerical data sets as long as an ordering relation can be defined over the elements.
An implementation in Java is shown below:
```java
import java.util.Arrays;
public class Heapsort {
public static void main(String[] args) {
//test array
Integer[] arr = {1, 4, 3, 2, 64, 3, 2, 4, 5, 5, 2, 12, 14, 5, 3, 0, -1};
String[] strarr = {"hope you find this helpful!", "wef", "rg", "q2rq2r", "avs", "erhijer0g", "ewofij", "gwe", "q", "random"};
arr = heapsort(arr);
strarr = heapsort(strarr);
System.out.println(Arrays.toString(arr));
System.out.println(Arrays.toString(strarr));
}
//O(nlogn) TIME, O(1) SPACE, NOT STABLE
public static <E extends Comparable<E>> E[] heapsort(E[] arr){
int heaplength = arr.length;
for(int i = arr.length/2; i>0;i--){
arr = maxheapify(arr, i, heaplength);
}
for(int i=arr.length-1;i>=0;i--){
E max = arr[0];
arr[0] = arr[i];
arr[i] = max;
heaplength--;
arr = maxheapify(arr, 1, heaplength);
}
return arr;
}
//Creates maxheap from array
public static <E extends Comparable<E>> E[] maxheapify(E[] arr, Integer node, Integer heaplength){
Integer left = node*2;
Integer right = node*2+1;
Integer largest = node;
if(left.compareTo(heaplength) <=0 && arr[left-1].compareTo(arr[node-1]) >= 0){
largest = left;
}
if(right.compareTo(heaplength) <= 0 && arr[right-1].compareTo(arr[largest-1]) >= 0){
largest = right;
}
if(largest != node){
E temp = arr[node-1];
arr[node-1] = arr[largest-1];
arr[largest-1] = temp;
maxheapify(arr, largest, heaplength);
}
return arr;
}
}
```
An implementation in C++ is shown below:
```C++
#include <iostream>
using namespace std;
// Sift the element at index i down so that the subtree rooted at i
// satisfies the max-heap property (n is the current size of the heap)
void heapify(int arr[], int n, int i)
{
    int largest = i;
    int l = 2*i + 1;   // left child
    int r = 2*i + 2;   // right child

    if (l < n && arr[l] > arr[largest])
        largest = l;
    if (r < n && arr[r] > arr[largest])
        largest = r;

    if (largest != i)
    {
        swap(arr[i], arr[largest]);
        heapify(arr, n, largest);
    }
}

void heapSort(int arr[], int n)
{
    // Build a max heap from the input array
    for (int i = n / 2 - 1; i >= 0; i--)
        heapify(arr, n, i);

    // Repeatedly move the current maximum to the end of the array
    // and restore the heap property on the shrunken heap
    for (int i = n - 1; i >= 0; i--)
    {
        swap(arr[0], arr[i]);
        heapify(arr, i, 0);
    }
}
void printArray(int arr[], int n)
{
for (int i=0; i<n; ++i)
cout << arr[i] << " ";
cout << "\n";
}
int main()
{
int arr[] = {12, 11, 13, 5, 6, 7};
int n = sizeof(arr)/sizeof(arr[0]);
heapSort(arr, n);
cout << "Sorted array is \n";
printArray(arr, n);
}
```
### Visualization
* <a href='https://www.cs.usfca.edu/~galles/visualization/HeapSort.html'>USFCA</a>
* <a href='https://www.hackerearth.com/practice/algorithms/sorting/heap-sort/tutorial/'>HackerEarth</a>
#### More Information:
- <a href='https://en.wikipedia.org/wiki/Heapsort' target='_blank' rel='nofollow'>Wikipedia</a>

View File

@ -0,0 +1,55 @@
---
title: Sorting Algorithms
---
## Sorting Algorithms
Sorting algorithms are a set of instructions that take an array or list as an input and arrange the items into a particular order.
Sorts are most commonly performed in numerical order or in a form of alphabetical order (called lexicographical order), and can be in ascending (A-Z, 0-9) or descending (Z-A, 9-0) order.
### Why Sorting Algorithms are Important
Since sorting can often reduce the complexity of a problem, it is an important algorithm in Computer Science. These algorithms have direct applications in searching algorithms, database algorithms, divide and conquer methods, data structure algorithms, and many more.
### Some Common Sorting Algorithms
Some of the most common sorting algorithms are:
* Selection Sort
* Bubble Sort
* Insertion Sort
* Merge Sort
* Quick Sort
* Heap Sort
* Counting Sort
* Radix Sort
* Bucket Sort
### Classification of Sorting Algorithms
Sorting algorithms can be categorized based on the following parameters:
1. Based on Number of Swaps or Inversions
This is the number of times the algorithm swaps elements to sort the input. `Selection Sort` requires the minimum number of swaps.
2. Based on Number of Comparisons
This is the number of times the algorithm compares elements to sort the input. Using <a href='https://guide.freecodecamp.org/computer-science/notation/big-o-notation/' target='_blank' rel='nofollow'>Big-O notation</a>, most of the sorting algorithms listed above require at least `O(n log n)` comparisons in the best case and `O(n^2)` comparisons in the worst case.
3. Based on Recursion or Non-Recursion
Some sorting algorithms, such as `Quick Sort`, use recursive techniques to sort the input. Others, such as `Selection Sort` or `Insertion Sort`, use non-recursive techniques. Finally, some sorting algorithms, such as `Merge Sort`, make use of both recursive and non-recursive techniques to sort the input.
4. Based on Stability
Sorting algorithms are said to be `stable` if the algorithm maintains the relative order of elements with equal keys. In other words, two equivalent elements keep the same relative order in the sorted output as they had in the input (see the short illustration after this list).
* `Insertion sort`, `Merge Sort`, and `Bubble Sort` are stable
* `Heap Sort` and `Quick Sort` are not stable
5. Based on Extra Space Requirement
Sorting algorithms are said to be `in place` if they require a constant `O(1)` extra space for sorting.
* `Insertion Sort` and `Quick Sort` are `in place` sorts: they rearrange elements within the input array and do not need a separate output array. This is not the case for `Merge Sort`, where an auxiliary array the size of the input must be allocated to hold the output during the sort.
* `Merge Sort` is an example of an `out of place` sort, as it requires extra memory space for its operations.
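A short illustration of stability, using Python's built-in sort (which is stable) on a small made-up list of (key, tag) pairs:
```python
# The tags 'a'..'d' mark the original order of the records
records = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]

# A stable sort keeps records with equal keys in their original
# relative order: 'b' stays before 'd', and 'a' stays before 'c'
print(sorted(records, key=lambda r: r[0]))
# [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]
```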
### Best possible time complexity for any comparison-based sorting
Any comparison-based sorting algorithm must make at least n log2(n) comparisons in the worst case to sort the input array, so Heapsort and Merge Sort are asymptotically optimal comparison sorts. This can be proved with a decision tree diagram: a decision tree for n elements has at least n! leaves (one per possible ordering), and a binary tree with n! leaves has height at least log2(n!), which grows on the order of n log2(n).
### Algorithmic Paradigm
Merge Sort and Quick Sort are based on the divide and conquer paradigm.

View File

@ -0,0 +1,182 @@
---
title: Insertion Sort
---
## Insertion Sort
Insertion sort is a simple sorting algorithm that is efficient for a small number of elements.
### Example:
In Insertion Sort, you compare the `key` element with the elements before it. If a previous element is greater than the `key`, you move that element one position ahead.
Start from index 1 and repeat up to the last index of the input array.
[ 8 3 5 1 4 2 ]
Step 1 :
![[ 8 3 5 1 4 2 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/1.png?raw=true)
```
key = 3 //starting from 1st index.
Here `key` will be compared with the previous elements.
In this case, `key` is compared with 8. since 8 > 3, move the element 8
to the next position and insert `key` to the previous position.
Result: [ 3 8 5 1 4 2 ]
```
Step 2 :
![[ 3 8 5 1 4 2 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/2.png?raw=true)
```
key = 5 //2nd index
8 > 5 //move 8 to 2nd index and insert 5 to the 1st index.
Result: [ 3 5 8 1 4 2 ]
```
Step 3 :
![[ 3 5 8 1 4 2 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/3.png?raw=true)
```
key = 1 //3rd index
8 > 1 => [ 3 5 1 8 4 2 ]
5 > 1 => [ 3 1 5 8 4 2 ]
3 > 1 => [ 1 3 5 8 4 2 ]
Result: [ 1 3 5 8 4 2 ]
```
Step 4 :
![[ 1 3 5 8 4 2 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/4.png?raw=true)
```
key = 4 //4th index
8 > 4 => [ 1 3 5 4 8 2 ]
5 > 4 => [ 1 3 4 5 8 2 ]
3 > 4 ≠> stop
Result: [ 1 3 4 5 8 2 ]
```
Step 5 :
![[ 1 3 4 5 8 2 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/5.png?raw=true)
```
key = 2 //5th index
8 > 2 => [ 1 3 4 5 2 8 ]
5 > 2 => [ 1 3 4 2 5 8 ]
4 > 2 => [ 1 3 2 4 5 8 ]
3 > 2 => [ 1 2 3 4 5 8 ]
1 > 2 ≠> stop
Result: [1 2 3 4 5 8]
```
![[ 1 2 3 4 5 8 ]](https://github.com/blulion/freecodecamp-resource/blob/master/insertion_sort/6.png?raw=true)
The algorithm below is a slightly optimized version that avoids swapping the `key` element in every iteration. Instead, the `key` element is placed in its final position at the end of each iteration (step).
```Algorithm
InsertionSort(arr[])
    for j = 1 to arr.length - 1
        key = arr[j]
        i = j - 1
        while i >= 0 and arr[i] > key
            arr[i+1] = arr[i]
            i = i - 1
        arr[i+1] = key
```
Here is a detailed implementation in JavaScript:
```js
function insertionSort(A) {
  const len = A.length;
  let i = 1;
  while (i < len) {
    const x = A[i];       // the key element to insert
    let j = i - 1;
    // Shift larger elements one position to the right
    while (j >= 0 && A[j] > x) {
      A[j + 1] = A[j];
      j = j - 1;
    }
    A[j + 1] = x;         // insert the key at its correct position
    i = i + 1;
  }
}
```
A quick implementation in Swift is as shown below :
```swift
var array = [8, 3, 5, 1, 4, 2]
func insertionSort(array: inout [Int]) -> [Int] {
    guard array.count > 1 else { return array }   // nothing to do for 0 or 1 elements
    for j in 1..<array.count {
        let key = array[j]
        var i = j - 1
        // Shift larger elements one position to the right
        while i >= 0 && array[i] > key {
            array[i + 1] = array[i]
            i = i - 1
        }
        array[i + 1] = key
    }
    return array
}
```
The Java example is shown below:
```java
public static int[] insertionSort(int[] arr) {
    for (int j = 1; j < arr.length; j++) {
        int key = arr[j];
        int i = j - 1;
        // Shift larger elements one position to the right
        while (i >= 0 && arr[i] > key) {
            arr[i + 1] = arr[i];
            i -= 1;
        }
        arr[i + 1] = key;
    }
    return arr;
}
```
### Insertion Sort in C
```C
void insertionSort(int arr[], int n)
{
int i, key, j;
for (i = 1; i < n; i++)
{
key = arr[i];
j = i-1;
while (j >= 0 && arr[j] > key)
{
arr[j+1] = arr[j];
j = j-1;
}
arr[j+1] = key;
}
}
```
### Properties:
* Space Complexity: O(1)
* Time Complexity: O(n), O(n^2), O(n^2) for best, average, and worst cases respectively
* Sorting In Place: Yes
* Stable: Yes
#### Other Resources:
- [Wikipedia](https://en.wikipedia.org/wiki/Insertion_sort)
- [CS50 - YouTube](https://youtu.be/TwGb6ohsvUU)
- [SortInsertion - GeeksforGeeks, YouTube](https://www.youtube.com/watch?v=wObxd4Kx8sE)
- [Insertion Sort Visualization](https://www.hackerearth.com/practice/algorithms/sorting/insertion-sort/visualize/)
- [Insertion Sort - MyCodeSchool](https://www.youtube.com/watch?v=i-SKeOcBwko)

View File

@ -0,0 +1,254 @@
---
title: Merge Sort
---
## Merge Sort
Merge Sort is a <a href='https://guide.freecodecamp.org/algorithms/divide-and-conquer-algorithms' target='_blank' rel='nofollow'>Divide and Conquer</a> algorithm. It divides the input array into two halves, calls itself for each half, and then merges the two sorted halves. The major portion of the algorithm is the merge step: given two sorted arrays, we have to merge them into a single sorted array. There is something known as the <a href='http://www.geeksforgeeks.org/merge-two-sorted-arrays/' target='_blank' rel='nofollow'>Two Finger Algorithm</a> that helps us merge two sorted arrays together. Using this subroutine and calling the merge sort function on the array halves recursively will give us the final sorted array we are looking for.
Since this is a recursion-based algorithm, we have a recurrence relation for it. A recurrence relation is simply a way of representing a problem in terms of its subproblems.
``` T(n) = 2 * T(n / 2) + O(n) ```
Putting it in plain English: at every step we break the problem down into two subproblems of half the size, and we do a linear amount of work to merge the two sorted halves back together.
```
T(n) = 2T(n/2) + n
= 2(2T(n/4) + n/2) + n
= 4T(n/4) + n + n
= 4(2T(n/8) + n/4) + n + n
= 8T(n/8) + n + n + n
= nT(n/n) + n + ... + n + n + n
= n + n + ... + n + n + n
```
Counting the number of repetitions of n in the sum at the end, we see that there are lg n + 1 of them. Thus the running time is n(lg n + 1) = n lg n + n. We observe that n lg n + n < n lg n + n lg n = 2n lg n for n > 2, so the running time is O(n lg n).
```Algorithm
MergeSort(arr[], left, right):
If right > left:
1. Find the middle point to divide the array into two halves:
mid = (left+right)/2
2. Call mergeSort for first half:
Call mergeSort(arr, left, mid)
3. Call mergeSort for second half:
Call mergeSort(arr, mid+1, right)
4. Merge the two halves sorted in step 2 and 3:
Call merge(arr, left, mid, right)
```
![Merge Sort Algorithm](https://upload.wikimedia.org/wikipedia/commons/thumb/e/e6/Merge_sort_algorithm_diagram.svg/300px-Merge_sort_algorithm_diagram.svg.png)
### Properties:
* Space Complexity: O(n)
* Time Complexity: O(n*log(n)). The time complexity of Merge Sort might not be obvious at first glance. The log(n) factor comes from the recurrence relation mentioned before.
* Sorting In Place: No, in a typical implementation
* Stable: Yes
* Parallelizable: Yes (several parallel variants are discussed in the third edition of Cormen, Leiserson, Rivest, and Stein's Introduction to Algorithms)
### Visualization:
* <a href='https://www.cs.usfca.edu/~galles/visualization/ComparisonSort.html'>USFCA</a>
* <a href='https://www.hackerearth.com/practice/algorithms/sorting/merge-sort/visualize/'>HackerEarth</a>
### Relevant videos on freeCodeCamp YouTube channel
* <a href='https://youtu.be/TzeBrDU-JaY'>Merge Sort algorithm - MyCodeSchool</a>
### Other Resources:
* <a href='https://en.wikipedia.org/wiki/Merge_sort' target='_blank' rel='nofollow'>Wikipedia</a>
* <a href='http://www.geeksforgeeks.org/merge-sort' target='_blank' rel='nofollow'>GeeksForGeeks</a>
* <a href='https://youtu.be/sWtYJv_YXbo' target='_blank' rel='nofollow'>Merge Sort - CS50</a>
### Implementation in JavaScript
```js
const list = [23, 4, 42, 15, 16, 8, 3]
const mergeSort = (list) =>{
if(list.length <= 1) return list;
const middle = Math.floor(list.length / 2);
const left = list.slice(0, middle);
const right = list.slice(middle, list.length);
return merge(mergeSort(left), mergeSort(right));
}
const merge = (left, right) => {
var result = [];
while(left.length || right.length) {
if(left.length && right.length) {
if(left[0] < right[0]) {
result.push(left.shift())
} else {
result.push(right.shift())
}
} else if(left.length) {
result.push(left.shift())
} else {
result.push(right.shift())
}
}
return result;
}
console.log(mergeSort(list)) // [ 3, 4, 8, 15, 16, 23, 42 ]
```
### Implementation in C
```C
#include<stdlib.h>
#include<stdio.h>
void merge(int arr[], int l, int m, int r)
{
int i, j, k;
int n1 = m - l + 1;
int n2 = r - m;
int L[n1], R[n2];
for (i = 0; i < n1; i++)
L[i] = arr[l + i];
for (j = 0; j < n2; j++)
R[j] = arr[m + 1+ j];
i = 0;
j = 0;
k = l;
while (i < n1 && j < n2)
{
if (L[i] <= R[j])
{
arr[k] = L[i];
i++;
}
else
{
arr[k] = R[j];
j++;
}
k++;
}
while (i < n1)
{
arr[k] = L[i];
i++;
k++;
}
while (j < n2)
{
arr[k] = R[j];
j++;
k++;
}
}
void mergeSort(int arr[], int l, int r)
{
if (l < r)
{
int m = l+(r-l)/2;
mergeSort(arr, l, m);
mergeSort(arr, m+1, r);
merge(arr, l, m, r);
}
}
void printArray(int A[], int size)
{
int i;
for (i=0; i < size; i++)
printf("%d ", A[i]);
printf("\n");
}
int main()
{
int arr[] = {12, 11, 13, 5, 6, 7};
int arr_size = sizeof(arr)/sizeof(arr[0]);
printf("Given array is \n");
printArray(arr, arr_size);
mergeSort(arr, 0, arr_size - 1);
printf("\nSorted array is \n");
printArray(arr, arr_size);
    return 0;
}
```
### Implementation in C++
The following snippet implements the merge step of Merge Sort: given two sorted arrays A and B, it merges them into a single sorted array C.
Let us consider array A = {2,5,7,8,9,12,13} and array B = {3,5,6,9,15}; we want array C to contain their elements in ascending order as well.
```c++
void merge(int A[], int size_a, int B[], int size_b, int C[])
{
int token_a,token_b,token_c;
for(token_a=0, token_b=0, token_c=0; token_a<size_a && token_b<size_b; )
{
if(A[token_a]<=B[token_b])
C[token_c++]=A[token_a++];
else
C[token_c++]=B[token_b++];
}
if(token_a<size_a)
{
while(token_a<size_a)
C[token_c++]=A[token_a++];
}
else
{
while(token_b<size_b)
C[token_c++]=B[token_b++];
}
}
```
### Implementation in Python
```python
temp = None

def merge(arr, left, right):
    global temp
    mid = (left + right) // 2
    # Copy the current segment into the auxiliary array
    for i in range(left, right + 1):
        temp[i] = arr[i]
    k, L, R = left, left, mid + 1
    # Merge the two sorted halves back into arr
    while L <= mid and R <= right:
        if temp[L] <= temp[R]:
            arr[k] = temp[L]
            L += 1
        else:
            arr[k] = temp[R]
            R += 1
        k += 1
    # Copy over any remaining elements of the left half
    while L <= mid:
        arr[k] = temp[L]
        L += 1
        k += 1
    # Copy over any remaining elements of the right half
    while R <= right:
        arr[k] = temp[R]
        R += 1
        k += 1

def merge_sort(arr, left, right):
    if left >= right:
        return
    mid = (left + right) // 2
    merge_sort(arr, left, mid)
    merge_sort(arr, mid + 1, right)
    merge(arr, left, right)

arr = [1, 6, 3, 1, 8, 4, 2, 9, 3]
temp = [None for _ in range(len(arr))]
merge_sort(arr, 0, len(arr) - 1)
print(arr)
```

View File

@ -0,0 +1,144 @@
---
title: Quick Sort
---
## Quick Sort
Quick Sort is an efficient divide and conquer sorting algorithm. The average case time complexity of Quick Sort is O(n log(n)), while its worst case time complexity is O(n^2).
The steps involved in Quick Sort are:
- Choose an element to serve as a pivot, in this case, the last element of the array is the pivot.
- Partitioning: Sort the array in such a manner that all elements less than the pivot are to the left, and all elements greater than the pivot are to the right.
- Call Quicksort recursively, taking into account the previous pivot to properly subdivide the left and right arrays. (A more detailed explanation can be found in the comments below)
A quick implementation in JavaScript:
```javascript
const arr = [6, 2, 5, 3, 8, 7, 1, 4]
const quickSort = (arr, start, end) => {
if(start < end) {
// You can learn about how the pivot value is derived in the comments below
let pivot = partition(arr, start, end)
// Make sure to read the below comments to understand why pivot - 1 and pivot + 1 are used
// These are the recursive calls to quickSort
quickSort(arr, start, pivot - 1)
quickSort(arr, pivot + 1, end)
}
}
const partition = (arr, start, end) => {
let pivot = end
// Set i to start - 1 so that it can access the first index in the event that the value at arr[0] is greater than arr[pivot]
// Succeeding comments will expound upon the above comment
let i = start - 1
let j = start
// Increment j up to the index preceding the pivot
while (j < pivot) {
// If the value is greater than the pivot increment j
if (arr[j] > arr[pivot]) {
j++
}
// When the value at arr[j] is less than the pivot:
// increment i (arr[i] will be a value greater than arr[pivot]) and swap the value at arr[i] and arr[j]
else {
i++
swap(arr, j, i)
j++
}
}
//The value at arr[i + 1] will be greater than the value of arr[pivot]
swap(arr, i + 1, pivot)
//You return i + 1, as the values to the left of it are less than arr[i+1], and values to the right are greater than arr[i + 1]
// As such, when the recursive quicksorts are called, the new sub arrays will not include this the previously used pivot value
return i + 1
}
const swap = (arr, firstIndex, secondIndex) => {
let temp = arr[firstIndex]
arr[firstIndex] = arr[secondIndex]
arr[secondIndex] = temp
}
quickSort(arr, 0, arr.length - 1)
console.log(arr)
```
A quick sort implementation in C
```C
#include<stdio.h>
void swap(int* a, int* b)
{
int t = *a;
*a = *b;
*b = t;
}
int partition (int arr[], int low, int high)
{
int pivot = arr[high];
int i = (low - 1);
for (int j = low; j <= high- 1; j++)
{
if (arr[j] <= pivot)
{
i++;
swap(&arr[i], &arr[j]);
}
}
swap(&arr[i + 1], &arr[high]);
return (i + 1);
}
void quickSort(int arr[], int low, int high)
{
if (low < high)
{
int pi = partition(arr, low, high);
quickSort(arr, low, pi - 1);
quickSort(arr, pi + 1, high);
}
}
void printArray(int arr[], int size)
{
int i;
for (i=0; i < size; i++)
printf("%d ", arr[i]);
printf("n");
}
int main()
{
int arr[] = {10, 7, 8, 9, 1, 5};
int n = sizeof(arr)/sizeof(arr[0]);
quickSort(arr, 0, n-1);
printf("Sorted array: n");
printArray(arr, n);
return 0;
}
```
Quick sort operates in place, rearranging the elements within the given array, so the only extra space it needs is the recursion stack: O(log n) on average and O(n) in the worst case. Compare this with the <a href='https://guide.freecodecamp.org/algorithms/sorting-algorithms/merge-sort' target='_blank' rel='nofollow'>merge sort</a> algorithm, which creates 2 arrays, each of length n/2, in each function call and therefore needs O(n) auxiliary space.
#### More Information:
- <a href='https://en.wikipedia.org/wiki/Quicksort' target='_blank' rel='nofollow'>Wikipedia</a>
- <a href='http://www.geeksforgeeks.org/quick-sort' target='_blank' rel='nofollow'>GeeksForGeeks</a>
- <a href='https://www.youtube.com/watch?v=MZaf_9IZCrc' target='_blank' rel='nofollow'>Youtube: A Visual Explanation of Quicksort</a>
- <a href='https://www.youtube.com/watch?v=SLauY6PpjW4' target='_blank' rel='nofollow'>Youtube: Gayle Laakmann McDowell (author of Cracking The Coding Interview) explains the basics of quicksort and show some implementations</a>
- <a href='https://www.youtube.com/watch?v=COk73cpQbFQ' target='_blank' rel='nofollow'>Quick Sort - MyCodeSchool</a>

View File

@ -0,0 +1,120 @@
---
title: Radix Sort
---
## Radix Sort
Prerequisite: Counting Sort
QuickSort, MergeSort, and HeapSort are comparison-based sorting algorithms.
Counting Sort is not comparison based; it has a complexity of O(n + k), where k is the maximum element of the input array.
So, if k is O(n), Counting Sort becomes a linear-time sort, which is better than comparison-based sorting algorithms whose lower bound is O(n log n).
The idea of Radix Sort is to extend Counting Sort so that we still get a good time complexity even when k grows as large as O(n^2).
Here comes the idea of Radix Sort.
Algorithm:
For each digit position i, from the least significant digit to the most significant digit of the largest number,
sort the input array according to the ith digit using the counting sort algorithm.
We use counting sort here because it is a stable sort.
Example: Assume the input array is:
10, 21, 17, 34, 44, 11, 654, 123
Based on the algorithm, we first sort the input array according to the one's digit (the least significant digit):
0: 10 <br>
1: 21 11 <br>
2: <br>
3: 123 <br>
4: 34 44 654 <br>
5: <br>
6: <br>
7: 17 <br>
8: <br>
9: <br>
So, the array becomes 10, 21, 11, 123, 34, 44, 654, 17.
Now, we sort according to the ten's digit:
0: <br>
1: 10 11 17 <br>
2: 21 123 <br>
3: 34 <br>
4: 44 <br>
5: 654 <br>
6: <br>
7: <br>
8: <br>
9: <br>
Now, the array becomes: 10, 11, 17, 21, 123, 34, 44, 654.
Finally, we sort according to the hundred's digit (the most significant digit):
0: 010 011 017 021 034 044 <br>
1: 123 <br>
2: <br>
3: <br>
4: <br>
5: <br>
6: 654 <br>
7: <br>
8: <br>
9: <br>
The array becomes 10, 11, 17, 21, 34, 44, 123, 654, which is sorted. This is how our algorithm works.
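Before the C implementation below, here is a short Python sketch that follows the bucket-per-digit walkthrough above; it assumes non-negative integers, and the names are only illustrative.
```python
def radix_sort(values):
    place = 1
    # Keep going while some value still has a digit at the current place
    while any(v // place for v in values):
        buckets = [[] for _ in range(10)]   # one bucket per digit 0-9
        for v in values:
            buckets[(v // place) % 10].append(v)
        # Collect the buckets in order; because appending preserves the
        # order from the previous pass, each pass is stable
        values = [v for bucket in buckets for v in bucket]
        place *= 10
    return values

print(radix_sort([10, 21, 17, 34, 44, 11, 654, 123]))
# [10, 11, 17, 21, 34, 44, 123, 654]
```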
An implementation in C:
```c
#define RANGE 10   // digits range from 0 to 9

void countsort(int arr[], int n, int place) {
    int i, freq[RANGE] = {0};
    int output[n];
    // count how many numbers have each digit at the current place value
    for (i = 0; i < n; i++)
        freq[(arr[i] / place) % RANGE]++;
    // prefix sums: freq[d] now holds the last (1-based) position for digit d
    for (i = 1; i < RANGE; i++)
        freq[i] += freq[i - 1];
    // build the output array, scanning right to left to keep the sort stable
    for (i = n - 1; i >= 0; i--) {
        output[freq[(arr[i] / place) % RANGE] - 1] = arr[i];
        freq[(arr[i] / place) % RANGE]--;
    }
    for (i = 0; i < n; i++)
        arr[i] = output[i];
}

void radixsort(int arr[], int n, int maxx) {   // maxx is the maximum element in the array
    int place = 1;
    // one counting-sort pass per digit of the largest element
    while (maxx) {
        countsort(arr, n, place);
        place *= 10;
        maxx /= 10;
    }
}
```
### More Information:
- [Wikipedia](https://en.wikipedia.org/wiki/Radix_sort)
- [GeeksForGeeks](http://www.geeksforgeeks.org/radix-sort/)

View File

@ -0,0 +1,96 @@
---
title: Selection Sort
---
## Selection Sort
Selection Sort is one of the simplest sorting algorithms. It works in the following way:
1. Find the smallest element. Swap it with the first element.
2. Find the second smallest element. Swap it with the second element.
3. Find the third smallest element. Swap it with the third element.
4. Repeat finding the next smallest element and swapping it into the corresponding correct position till the array is sorted.
As you can guess, this algorithm is called Selection Sort because it repeatedly selects the next smallest element and swaps it into its place.
But, how would you write the code for finding the index of the second smallest value in an array?
* An easy way is to notice that the smallest value has already been swapped into index 0, so the problem reduces to finding the smallest element in the array starting at index 1.
### Implementation in C/C++
```C
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

void selectionSort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++)
    {
        // Find the index of the smallest element in a[i..n-1]
        int min_index = i;
        for (int j = i + 1; j < n; j++)
        {
            if (a[j] < a[min_index])
                min_index = j;
        }
        // Swap it into position i
        swap(&a[i], &a[min_index]);
    }
}
```
### Implementation in Javascript
```js
function selectionSort(A) {
  const len = A.length;
  for (let i = 0; i < len - 1; i++) {
    // Find the index of the smallest element in A[i..len-1]
    let jMin = i;
    for (let j = i + 1; j < len; j++) {
      if (A[j] < A[jMin]) {
        jMin = j;
      }
    }
    // Swap it into position i if it is not already there
    if (jMin !== i) {
      swap(A, i, jMin);
    }
  }
}

function swap(A, x, y) {
  const temp = A[x];
  A[x] = A[y];
  A[y] = temp;
}
```
### Implementation in Python
```python
def selection_sort(arr):
    if not arr:
        return arr
    for i in range(len(arr)):
        # Find the index of the smallest element in arr[i:]
        min_i = i
        for j in range(i + 1, len(arr)):
            if arr[j] < arr[min_i]:
                min_i = j
        # Swap it into position i
        arr[i], arr[min_i] = arr[min_i], arr[i]
    return arr
```
### Properties
* Space Complexity: <b>O(1)</b>
* Time Complexity: <b>O(n<sup>2</sup>)</b>
* Sorting in Place: <b>Yes</b>
* Stable: <b>No</b>
### Visualization
* [USFCA](https://www.cs.usfca.edu/~galles/visualization/ComparisonSort.html)
* [HackerEarth](https://www.hackerearth.com/practice/algorithms/sorting/selection-sort/visualize/)
### References
* [Wikipedia](https://en.wikipedia.org/wiki/Selection_sort)
* [KhanAcademy](https://www.khanacademy.org/computing/computer-science/algorithms#sorting-algorithms)
* [MyCodeSchool](https://www.youtube.com/watch?v=GUDLRan2DWM)

View File

@ -0,0 +1,111 @@
---
title: Timsort
---
## Timsort
Timsort is a fast, stable sorting algorithm that runs in O(N log(N)) time.
Timsort is a blend of Insertion Sort and Merge Sort. The algorithm is used in Java's Arrays.sort() (for arrays of objects) as well as Python's sorted() and list.sort().
The smaller parts (runs) are sorted using Insertion Sort and are later merged together using the merge step of Merge Sort.
A quick implementation in Python:
```python
def binary_search(the_array, item, start, end):
if start == end:
if the_array[start] > item:
return start
else:
return start + 1
if start > end:
return start
mid = round((start + end)/ 2)
if the_array[mid] < item:
return binary_search(the_array, item, mid + 1, end)
elif the_array[mid] > item:
return binary_search(the_array, item, start, mid - 1)
else:
return mid
"""
Insertion sort that timsort uses if the array size is small or if
the size of the "run" is small
"""
def insertion_sort(the_array):
l = len(the_array)
for index in range(1, l):
value = the_array[index]
pos = binary_search(the_array, value, 0, index - 1)
the_array = the_array[:pos] + [value] + the_array[pos:index] + the_array[index+1:]
return the_array
def merge(left, right):
"""Takes two sorted lists and returns a single sorted list by comparing the
elements one at a time.
[1, 2, 3, 4, 5, 6]
"""
if not left:
return right
if not right:
return left
if left[0] < right[0]:
return [left[0]] + merge(left[1:], right)
return [right[0]] + merge(left, right[1:])
def timsort(the_array):
runs, sorted_runs = [], []
length = len(the_array)
new_run = [the_array[0]]
# for every i in the range of 1 to length of array
for i in range(1, length):
# if i is at the end of the list
if i == length - 1:
new_run.append(the_array[i])
runs.append(new_run)
break
        # if the i'th element of the array is less than the one before it,
        # the current run ends here
        if the_array[i] < the_array[i-1]:
            runs.append(new_run)
            # start the next run with the current element so it is not lost
            new_run = [the_array[i]]
        # otherwise the current run keeps growing
        else:
            new_run.append(the_array[i])
# for every item in runs, append it using insertion sort
for item in runs:
sorted_runs.append(insertion_sort(item))
# for every run in sorted_runs, merge them
sorted_array = []
for run in sorted_runs:
sorted_array = merge(sorted_array, run)
print(sorted_array)
timsort([2, 3, 1, 5, 6, 7])
```
#### Complexity:
Timsort has a stable O(N log(N)) complexity and compares really well with Quicksort.
A comparison of complexities can be found on this [chart](https://cdn-images-1.medium.com/max/1600/1*1CkG3c4mZGswDShAV9eHbQ.png).
#### More Information:
- <a href='https://en.wikipedia.org/wiki/Timsort' target='_blank' rel='nofollow'>Wikipedia</a>
- <a href='https://www.geeksforgeeks.org/timsort/' target='_blank' rel='nofollow'>GeeksForGeeks</a>
- <a href='https://www.youtube.com/watch?v=jVXsjswWo44' target='_blank' rel='nofollow'>Youtube: A Visual Explanation of Quicksort</a>
#### Credits:
[Python Implementation](https://hackernoon.com/timsort-the-fastest-sorting-algorithm-youve-never-heard-of-36b28417f399)