# Computational complexity

When we talk about the complexity of an algorithm, we generally refer to the worst-case scenario (Big O) and the best-case scenario (Big Omega).
The growth of an algorithm's runtime is dictated by its highest-order term.

Asymptotic notation – a family of notations invented by Paul Bachmann, Edmund Landau, and others (Big O, Big Omega, Big Theta, …)

O(1) Constant time
Asymptotically constant. Always takes a constant number of operations, regardless of the input size.

Example:
Accessing the value of a variable.

```c
int four_for_you(int array[])
{
    return 4;
}

int add_two_numbers(int a, int b)
{
    return a + b;
}
```

O(log n) Logarithmic time
Example: binary search.
O(n) Linear time
Always takes n operations in the worst case.
Example:
– searching for a number in an array: look at every element to see if the number is there.
– counting the number of characters in a string, from the first to the last.

O(n log n) Linearithmic time
O(n^c) Polynomial time
O(c^n) Exponential time
O(n!) Factorial time

Runtime of a for loop

```c
// O(m)
for (int j = 0; j < m; j++)
{
    // loop body that runs in O(1)
}

// O(p^2)
for (int j = 0; j < p; j++)
{
    for (int k = 0; k < p; k++)
    {
        // loop body that runs in O(1)
    }
}
```

# Big O notation

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.

Upper bound of steps.

Bubble sort, selection sort, and insertion sort are all O(n^2).

# Ω Omega notation

Lower Bound notation

# θ Theta notation

When the best and worst cases are the same, i.e. O = Ω.

Example:
storing the length of a string in a variable, so that looking it up always takes the same single step.

# Algorithms

https://www.toptal.com/developers/sorting-algorithms

## Linear Search

Iterate across the array from left to right, searching for a specified element.

Pseudocode
Repeat, starting at the first element:
If the current element is what we are looking for, stop
Otherwise, move to the next element

**Big O** We have to look through the entire array of n elements, either because the target element is the last element of the array or because it doesn't exist in the array at all.
**Big Omega** The target element is the first element of the array, and so we can stop looking immediately after we start.

```
linearSearch(key, array[]):
    for (i = 0; i < length(array); i++):
        if (array[i] == key):
            return i
    return -1
```

O(n)
Omega(1)

## Binary Search

The idea of the algorithm is to divide and conquer, reducing the search area by half each time, trying to find a target number. The algorithm assumes that the array M is sorted in ascending order.

BinarySearch(key, array[], min_index, max_index)

Array of size 7:
[1 2 3] (4) [5 6 7]
log 7 / log 2 = log₂ 7 ≈ 2.8, so at most 3 halvings are needed.

You are looking for an element x inside an array M, which contains n elements.

Pseudocode
Repeat until the (sub)array is of size 0:
Calculate the middle point of the current (sub)array.
If the target is at the middle, stop.
Otherwise, if the target is less than what's at the middle, repeat, changing the end point to be just to the left of the middle.
Otherwise, if the target is greater than what's at the middle, repeat, changing the start point to be just to the right of the middle.

**In other words**
1. Set low to 0, high to n − 1.
2. If low > high, x does not exist in M and the algorithm terminates.
3. Set mid to (low + high) / 2.
4. If M[mid] < x, set low to mid + 1 and go to step 2.
5. If M[mid] > x, set high to mid − 1 and go to step 2.
6. Otherwise, M[mid] equals x and the algorithm terminates.

Code

```
binarySearch(key, array[], min, max):
    if (max < min):
        return -1
    else:
        midpoint = findMidPoint(min, max)
        if (array[midpoint] < key):
            return binarySearch(key, array, midpoint + 1, max)
        elif (array[midpoint] > key):
            return binarySearch(key, array, min, midpoint - 1)
        else:
            return midpoint
```

**Big O** We have to divide a list of n elements in half repeatedly to find the target element, either because the target element will be found at the end of the last division or because it doesn't exist in the array at all.
**Big Omega** The target element is at the midpoint of the full array, and so we can stop looking immediately after we start.

O(log n)
Omega(1)

### Binary Search Tree

Properties
– The left subtree of a node only contains values that are less than or equal to the node's value.
– The right subtree of a node only contains values that are greater than or equal to the node's value.
– Both the left and right subtrees of a node are also binary search trees.

Minimum value: always go to the left.
Maximum value: always go to the right.

Inorder traversal: visits the values from the minimum to the maximum by traversing the tree.

## Bubble Sort

Move higher-valued elements generally towards the right and lower-valued elements towards the left.

Pseudocode
Set swap counter to a non-zero value (e.g. -1)
Repeat until the swap counter is 0:
Reset swap counter to 0
If two adjacent elements are not in order, swap them and add one to the swap counter

Notice that after the first pass the highest number is in its place.
The swap counter tells us when the array is sorted – when there are no more swaps.

**Big O** The array is in reverse order; we have to "bubble" each of the n elements all the way across the array, and since we can only fully bubble one element into position per pass, we must do this n times.
**Big Omega** The array is already perfectly sorted, and we make no swaps on the first pass.

O(n^2)
Omega(n)

Summary: compare two adjacent elements, swap if needed; compare the next two, swap if needed.
End the sort when there are no more swaps.

## Selection sort (search for smallest)

Find the smallest unsorted element and add it to the end of the sorted list.

Pseudocode
Repeat until no unsorted elements remain
Search the unsorted part of data to find the smallest value
Swap the smallest found value with the first element of the unsorted part

Big O: we have to iterate over each of the n elements of the array (to find the smallest unsorted element), and we must repeat this process n times, since only one element gets sorted on each pass.
Big Omega: exactly the same. We can't say the array is sorted until we have gone through all the elements.

(n-1) + (n-2) + … +1 = n(n-1) / 2 = (n^2 – n) / 2 = n^2/2 – n/2 = O(n^2)

Divide the list into a sorted portion and an unsorted portion.

```
for i = 1 to n - 1
    min = i
    for j = i + 1 to n
        if array[j] < array[min]
            min = j
    if min != i
        swap array[min] and array[i]
```

## Insertion sort

Build the sorted array in place, shifting elements out of the way if necessary to make room as you go.

Pseudocode
Call the first element of the array sorted
Repeat until all elements are sorted:
Look at the next unsorted element. Insert it into the sorted portion by shifting the requisite number of elements.

Big O: The array is in reverse order; we have to shift each of the n elements up to n positions each time we make an insertion.
Big Omega: The array is already sorted, and we simply keep moving the line between the unsorted and sorted portions as we examine each element.

O(n^2)
Omega(n)

```
for i = 1 to n - 1
    element = array[i]
    j = i
    while (j > 0 and array[j - 1] > element)
        array[j] = array[j - 1]
        j = j - 1
    array[j] = element
```

Summary: go through the array linearly, sorting one element at a time and moving multiple elements if needed.

## Quicksort

O(n log n) on average; O(n^2) in the worst case.

• Pick the last element in the array and call it the pivot
• Partition the other elements into those less than the pivot and those greater, then sort each side the same way

## Merge sort

The idea of the algorithm is to sort smaller arrays and then combine those arrays together (merge them) in sorted order.
Merge sort leverages recursion.

Recursive sort
O(n* log n)

**Pseudocode**
On input of n elements:
if n < 2
    return
else
    sort left half of elements in the array
    sort right half of elements in the array
    merge sorted halves

The if statement runs in constant time: T(n) = O(1) if n < 2.
Sort left half of elements: T(n/2)
Sort right half of elements: T(n/2)
Merge: O(n)

```
T(n) = T(n/2) + T(n/2) + O(n) = O(n log n), if n >= 2
```

**Big O** We have to split n elements up and then recombine them, effectively doubling the sorted subarrays as we build them up.
**Big Omega** The array is already perfectly sorted, but we still have to split it and recombine it back together.

O(n log n)
Omega(n log n)

Sources:
https://courses.edx.org/courses/course-v1:HarvardX+CS50+X/ Week 3