# Implementation of Sorting Algorithms in Java

This document lists common sorting algorithms together with my own Java implementations. I have tested them with the JUnit tests at the end of this article; however, I cannot guarantee that the implementations are 100% correct. Feel free to comment below or write me an email if you find a mistake.

## Bubble sort

The idea behind bubble sort is very simple: we compare two adjacent elements, and if the element on the left is bigger than the one on the right, we swap them. One commonly says that the big elements bubble up. This means that after the first pass of the inner loop, the biggest element of the array is guaranteed to be in the rightmost position. After the second pass the second biggest will be in the right place, and so on. Executing the inner loop $$n-1$$ times guarantees that the array is sorted. The runtimes are therefore as follows:

• Number of swaps: $$\mathcal{O}(n^2)$$
• Number of comparisons: $$\mathcal{O}(n^2)$$
```java
static int[] bubbleSort(int[] array) {
    int len = array.length;
    for (int i = 0; i < len - 1; i++) {
        for (int j = 0; j < len - 1; j++) {
            if (array[j] > array[j + 1]) {
                int temp = array[j];
                array[j] = array[j + 1];
                array[j + 1] = temp;
            }
        }
    }
    return array;
}
```

## Selection sort

Selection sort is a very intuitive algorithm. We look at all the numbers we are trying to sort and always pick the smallest one that we haven't sorted yet. We then append it to the end of our sorted list. This means that the array gets sorted from left to right: the sorted left part grows with each element we insert, and the unsorted right part shrinks until we remove its last element, which is the largest element of the whole array. To get a better picture you can look at the code below. The runtimes are therefore as follows:

• Number of swaps: $$\mathcal{O}(n)$$, since we swap only once for each new smallest element we find
• Number of comparisons: $$\mathcal{O}(n^2)$$, since we need to find the next smallest element among the remaining numbers
```java
static int[] selectionSort(int[] array) {
    for (int i = 0; i < array.length; i++) {
        // temporary min to find the minimal element on the right side
        int min = Integer.MAX_VALUE;
        // temporary variable to store the index of the minimal element on the right side
        int minIndex = 0;
        // loop through the remaining elements (right side) to find the minimum
        for (int j = i; j < array.length; j++) {
            // if it is smaller than the current smallest, it is the new smallest
            if (array[j] < min) {
                minIndex = j;
                min = array[j];
            }
        }
        // we have found the smallest remaining element; we swap it with the element at the
        // i-th position, growing the already sorted left part by one and
        // shrinking the unsorted right part by one
        int temp = array[i];
        array[i] = array[minIndex];
        array[minIndex] = temp;
    }
    // we return the sorted array
    return array;
}
```

## Insertion Sort

Insertion sort works similarly to selection sort above. We loop through the whole array; assume we are now at the $$i$$-th position. Insertion sort guarantees that the subarray $$[0,i]$$ is already perfectly sorted. We now look at the $$(i+1)$$-th element and would like to insert it into our already sorted subarray.

To find an insertion point we can use either binary search or linear search (I have code examples for both below). Assume we can insert the $$(i+1)$$-th element at position $$k$$. We then move all elements in the subarray $$[k,i]$$ up by one position and insert the element from position $$i+1$$ at position $$k$$. We do this for every element in the array, moving from left to right.

• Number of swaps: $$\mathcal{O}(n^2)$$, for every insertion we need to move part of our sorted subarray
• Number of comparisons: $$\mathcal{O}(n \log(n))$$, if we are using binary search to find the insertion index
```java
/**
 * sorts an array using insertion sort
 * @param arr the array we are trying to sort
 * @return sorted array
 */
static int[] insertionSort(int[] arr) {
    // we look at every position of the array and see where we can insert the next element
    for (int i = 0; i < arr.length - 1; i++) {
        int toInsert = arr[i + 1];
        // find the index where we can insert it
        int insertionPoint = binarySearch(arr, toInsert, 0, i + 1);
        // shift everything in the already sorted subarray one position to the right
        for (int j = i; j >= insertionPoint; j--) {
            arr[j + 1] = arr[j];
        }
        arr[insertionPoint] = toInsert;
    }
    return arr;
}
```

```java
/**
 * finds the index where an object gets inserted into a subarray
 * within an array, specified by a left and right index
 * @param arr the array in which we have the subarray
 * @param object the object we are trying to insert into the array
 * @param l the left bound of the subarray within the array
 * @param r the right bound of the subarray within the array
 * @return index of the position where the object can be inserted
 */
static int binarySearch(int[] arr, int object, int l, int r) {
    int ll = l;
    int rr = r;
    int middle = l + ((r - l) / 2);
    while (ll < rr) {
        if (arr[middle] > object) {
            rr = middle;
        } else {
            ll = middle + 1;
        }
        middle = (ll + rr) / 2;
    }
    return ll;
}
```

```java
/**
 * finds the index where an object gets inserted into a subarray
 * within an array, specified by a left and right index
 * @param arr the array in which we have the subarray
 * @param object the object we are trying to insert into the array
 * @param l the left bound of the subarray within the array
 * @param r the right bound of the subarray within the array
 * @return index of the position where the object can be inserted
 */
static int linearSearch(int[] arr, int object, int l, int r) {
    int pointer = l;
    while (pointer <= r) {
        if (arr[pointer] == object) {
            return pointer;
        } else if (arr[pointer] < object) {
            pointer++;
        } else { // arr[pointer] > object
            return pointer;
        }
    }
    return pointer;
}
```

## Merge Sort

Merge sort is a recursive sorting algorithm (non-recursive versions also exist, of course). Given an array of length $$n$$, we split it in half and apply the merge sort function to the two subarrays. At some point each subarray contains just one element; this is where the merge part comes in. We take two adjacent subarrays and merge them together, such that the resulting array contains all the elements of the two subarrays in sorted order.

The trick is that a subarray containing only one element is trivially sorted. This allows us to merge two subarrays with two pointers, as shown in the code below, because the two subarrays we are merging are always sorted.

Although the algorithm may seem overengineered, it is very efficient and, as an added bonus, can easily be parallelized. Calculating the runtime is a bit tricky and involves solving recurrences (which I will cover in a future post), but in case you are interested, the recurrence relation for merge sort is $$T(n) = 2T(\frac{n}{2}) + \mathcal{O}(n) + \mathcal{O}(1)$$. This gives us:

• Number of swaps: $$\mathcal{O}(n \log(n))$$
• Number of comparisons: $$\mathcal{O}(n \log(n))$$
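In case you want to convince yourself of the bound, the recurrence can be unrolled level by level (a sketch, writing the linear term as $$cn$$ for some constant $$c$$; each level of the recursion does $$cn$$ work in total):

$$T(n) = 2T\left(\tfrac{n}{2}\right) + cn = 4T\left(\tfrac{n}{4}\right) + 2cn = \dots = 2^k T\left(\tfrac{n}{2^k}\right) + k \cdot cn$$

After $$k = \log_2(n)$$ halvings the subarrays have size one, leaving $$T(n) = n \cdot T(1) + cn\log_2(n) = \mathcal{O}(n \log(n))$$.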
```java
/**
 * Wrapper function which calls the recursive merge sort function
 * @param arr unsorted array
 * @return sorted array
 */
static int[] mergeSort(int[] arr) {
    return recursiveMergeSort(arr, 0, arr.length - 1);
}

/**
 * @param arr the array containing the subarray we would like to sort
 * @param l left endpoint of the subarray we want to sort within arr
 * @param r right endpoint of the subarray we want to sort within arr
 * @return array with sorted subarray
 */
static int[] recursiveMergeSort(int[] arr, int l, int r) {
    if (l < r) {
        // find the mid-point where we split the array into two subarrays
        int m = (l + r) / 2;
        // call merge sort on the left subarray
        recursiveMergeSort(arr, l, m);
        // call merge sort on the right subarray
        recursiveMergeSort(arr, m + 1, r);
        // merge the left and right subarrays
        merge(arr, l, m, r);
    }
    return arr;
}

/**
 * This function merges two sorted subarrays into a sorted array
 * occupying the same space the two adjacent subarrays did.
 * @param arr the array in which we have two subarrays
 * @param l index where the left subarray begins
 * @param m the middle index - where the left subarray stops and the right begins
 * @param r the index where the right subarray ends
 * @return array with sorted subarray
 */
static int[] merge(int[] arr, int l, int m, int r) {
    // two pointers initialized to the beginning of the two subarrays
    int p1 = l;
    int p2 = m + 1;
    // temporary array to store the merged array
    int[] temp = new int[r - l + 1];
    // counter for the next free position in temp
    int counter = 0;

    // now we can merge
    while (p1 <= m && p2 <= r) {
        if (arr[p1] > arr[p2]) {
            temp[counter] = arr[p2];
            p2++;
        } else {
            temp[counter] = arr[p1];
            p1++;
        }
        counter++;
    }
    // merge completed - copy any items left in the left subarray
    while (p1 <= m) {
        temp[counter] = arr[p1];
        counter++;
        p1++;
    }
    // merge completed - copy any items left in the right subarray
    while (p2 <= r) {
        temp[counter] = arr[p2];
        counter++;
        p2++;
    }
    // copy the temp array back into the original array
    for (int i = 0; i < temp.length; i++) {
        arr[l + i] = temp[i];
    }
    // return the array
    return arr;
}
```

## Heap Sort

Heap sort is a rather complicated sorting algorithm. It first transforms the whole array we are trying to sort into a max-heap (if you do not know what a max-heap is, you can read more on Wikipedia). Then we remove the biggest element from the max-heap by swapping it with the element at the right bound of the heap. We then move the right bound of the max-heap one position to the left and restore the heap condition. We repeat this sequence (remove the maximum, place it at the end of the sorted part, move the right bound, rebalance) as long as there are elements in the max-heap. At the end only the smallest element remains in the max-heap, and we can stop, as the array is now sorted.

• Number of swaps: $$\mathcal{O}(n\log(n))$$
• Number of comparisons: $$\mathcal{O}(n \log(n))$$
```java
static int[] heapSort(int[] array) {
    // first we transform the array into a max-heap
    for (int i = array.length / 2; i >= 0; i--) {
        restoreHeapCondition(array, i, array.length - 1);
    }
    for (int rightPointer = array.length - 1; rightPointer >= 0; rightPointer--) {
        restoreHeapCondition(array, 0, rightPointer);

        // swap the maximum (the root) with the element at the right bound
        int temp = array[rightPointer];
        array[rightPointer] = array[0];
        array[0] = temp;
    }
    return array;
}
```

```java
static void restoreHeapCondition(int[] array, int pos, int rightPos) {
    int leftChildIndex = pos * 2 + 1;
    int rightChildIndex = pos * 2 + 2;
    int largerIndex = pos;

    // check that the left child exists
    if (leftChildIndex <= rightPos) {
        if (array[leftChildIndex] > array[pos]) {
            largerIndex = leftChildIndex;
        }
    }

    // check that the right child exists
    if (rightChildIndex <= rightPos) {
        if (array[rightChildIndex] > array[largerIndex]) {
            largerIndex = rightChildIndex;
        }
    }

    // check if we found a larger child
    if (pos != largerIndex) {
        // we need to swap
        int temp = array[pos];
        array[pos] = array[largerIndex];
        array[largerIndex] = temp;
        // since we moved an element down, make sure the subtree below still satisfies the heap condition
        restoreHeapCondition(array, largerIndex, rightPos);
    }
}
```

## Quick Sort

Quicksort is another non-trivial sorting algorithm. We look at the rightmost element of the array and call it the pivot. We then initialize two pointers, one moving from left to right through the array and the other from right to left, and let them advance towards each other. The pointer coming from the left stops when it reaches an element bigger than the pivot; the pointer coming from the right stops when it reaches an element smaller than the pivot. Once both pointers have stopped, we swap the two elements, and we keep doing this as long as the pointers have not crossed each other. When they cross, we swap the element at the left pointer with the pivot.

Now we split our array into a subarray left of the position we have placed the pivot and a subarray to the right of the position of the pivot. We repeat the procedure above for the two subarrays. As soon as the remaining subarrays have size one, we stop the recursion.

This sounds very complex, and the description above only highlights the basic idea. If you want to find out more, check out the quicksort article on Wikipedia.

• Number of comparisons: $$\mathcal{O}(n^2)$$ in the worst case, but $$\mathcal{O}(n \log(n))$$ in the average case
```java
static int[] quickSort(int[] arr) {
    return recQuickSort(arr, 0, arr.length - 1);
}

static int[] recQuickSort(int[] arr, int l, int r) {
    if (l < r) {
        // partition the array
        int p = partition(arr, l, r);
        // recursively sort the two subarrays
        recQuickSort(arr, l, p - 1);
        recQuickSort(arr, p + 1, r);
    }
    return arr;
}

static int partition(int[] arr, int l, int r) {
    // take the pivot from the rightmost position
    int pivot = arr[r];
    // define the left and right pointers
    int ll = l;
    int rl = r - 1;
    do {
        // advance the left pointer as long as its element is smaller than or equal to the pivot
        while (arr[ll] <= pivot && ll < r) {
            ll++;
        }
        // advance the right pointer as long as its element is bigger than or equal to the pivot
        while (arr[rl] >= pivot && rl > l) {
            rl--;
        }
        // if the pointers crossed, we abort the loop
        if (ll >= rl) {
            break;
        }
        // they didn't cross -> we swap the elements at the two positions
        int temp = arr[ll];
        arr[ll] = arr[rl];
        arr[rl] = temp;
    } while (ll < rl);
    // swap the pivot with the element the left pointer points to
    arr[r] = arr[ll];
    arr[ll] = pivot;
    return ll;
}
```

## JUnit Tests

If you would like to test your sorting algorithms, feel free to use the code below. There is one test case already, but the more test cases you add, the better tested your software is, so I encourage you to write more.

```java
import static org.junit.jupiter.api.Assertions.*;

import java.util.Arrays;
import org.junit.jupiter.api.Test;

class TestSortingAlgorithms {

    static int[] referenceSolution = new int[] {-21,-3,0,0,1,2,2,4,4,4,5,6,6,12,91,99};

    @Test
    void testMergeSort() {
        int[] arrayToBeSorted = new int[] {5,2,99,12,6,4,4,91,-21,0,0,-3,1,6,4,2};
        assertEquals(Arrays.toString(referenceSolution),
                Arrays.toString(MergeSort.mergeSort(arrayToBeSorted)));
    }
}
```
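One cheap way to get many test cases at once is to compare an implementation against `java.util.Arrays.sort` on random inputs. Below is a sketch of this idea for `bubbleSort` (copied in from above so the snippet is self-contained; the class name `RandomizedSortCheck` and the trial counts are just illustrative choices):

```java
import java.util.Arrays;
import java.util.Random;

public class RandomizedSortCheck {

    // bubbleSort from above, copied here so the snippet compiles on its own
    static int[] bubbleSort(int[] array) {
        int len = array.length;
        for (int i = 0; i < len - 1; i++) {
            for (int j = 0; j < len - 1; j++) {
                if (array[j] > array[j + 1]) {
                    int temp = array[j];
                    array[j] = array[j + 1];
                    array[j + 1] = temp;
                }
            }
        }
        return array;
    }

    public static void main(String[] args) {
        Random rnd = new Random(42); // fixed seed so failures are reproducible
        for (int trial = 0; trial < 100; trial++) {
            // random length (including 0, which covers the empty-array edge case)
            int[] input = rnd.ints(rnd.nextInt(50), -100, 100).toArray();
            int[] expected = input.clone();
            Arrays.sort(expected); // trusted reference implementation
            int[] actual = bubbleSort(input.clone());
            if (!Arrays.equals(expected, actual)) {
                throw new AssertionError("Mismatch on input " + Arrays.toString(input));
            }
        }
        System.out.println("All randomized trials passed");
    }
}
```

The same harness works for any of the other sorts; just swap out the method under test.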