Merge Sort is a divide-and-conquer algorithm. It recursively divides the input array in half, sorts each half, and merges the sorted halves back together.

Time Complexity

Merging N elements takes O(N) time, and the array is halved O(log N) times before the subarrays reach size one, so the total time complexity is O(N * log N), where N is the size of the input.

Implementation

def merge_sort(array)
    # base case
    if array.size <= 1
        return array
    end

    # find the midpoint
    mid = array.size / 2
    
    left  = merge_sort(array[0..mid-1])
    right = merge_sort(array[mid..array.size-1])

    # merging left and right arrays
    return merge(left, right)
end

def merge(left, right)
    array_sorted = []

    # compare the first elements of the left and right arrays;
    # the smaller one is pushed to array_sorted and removed from its array
    # (taking from left on ties keeps the sort stable)
    while left.any? && right.any?
        if left.first > right.first
            array_sorted << right.shift
        else
            array_sorted << left.shift
        end
    end

    # append whatever remains in left or right
    return array_sorted + left + right
end

array = [5, 1, 10, 2]
p merge_sort(array) # => [1, 2, 5, 10]

Insertion Sort is a simple sorting algorithm. For large lists, more advanced algorithms such as quicksort, heapsort, or merge sort are recommended, so insertion sort is mainly used on small lists, where it has some advantages:

  • Simple
  • Efficient for small data sets, much like other quadratic sorting algorithms
  • More efficient in practice than most other simple quadratic algorithms such as selection or bubble sort
  • Stable: it does not change the relative order of elements with equal keys
  • In-place: it only requires a constant amount O(1) of additional memory space

Implementation

# iterative version
def insertion_sort(array)
    # we can say that one element is already sorted
    # therefore start with index 1
    (1..array.size-1).each do |j|
        key = array[j] # temp value

        # insert array[j] into the sorted sequence array[0..j-1];
        # the i >= 0 check below stops the loop once i reaches -1,
        # at which point the temp key value goes at the front
        i = j-1
        while (i >= 0 && array[i] > key)
            array[i+1] = array[i]
            i -= 1
        end
        array[i+1] = key
    end

    return array
end

array = [5, 1, 10, 2]
p insertion_sort(array) # => [1, 2, 5, 10]

Complexity

Time Complexity

The worst case happens when the given list is reverse sorted, taking O(n²) time; the best case happens when the list is already sorted, taking Ω(n).

Space Complexity

It is an in-place algorithm, so it only needs a constant amount of extra space, O(1). As you can see above, only three extra variables are used: key, i, and j.

Reduce complexity

The time complexity of insertion sort depends on two steps: the number of comparisons and the number of movements. If so, can we decrease the time complexity by applying a binary search or a linked list? The answer is “NO”.

Assume that we have an array A and we know A[0..j-1] is sorted. We can find the correct position of A[j] by applying binary search, so the search time is reduced to O(log n). However, inserting at the correct position still takes O(n), because inserting at index 0 means every element from index 1 onward has to be shifted. Each search-and-insert step is therefore O(n), and it has to be done nearly n-1 times, so the total is O(n²), as the sketch after the table below illustrates.

If we don’t want any movement, a doubly linked list should be considered. In this case, we can insert an element at the correct position directly without shifting other elements, so the movement takes O(1). However, finding the correct position takes O(n) because it is a linear search. This also has to be done nearly n-1 times, so the total is still O(n²).

                     Comparison   Movement
Binary search        O(log n)     O(n)
Doubly linked list   O(n)         O(1)
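
As a minimal sketch of the binary search variant (the method binary_insertion_sort below is my own illustration, not part of the implementations above): even though the position is found with O(log n) comparisons, the delete_at and insert calls still shift O(n) elements each pass.

def binary_insertion_sort(array)
    (1..array.size-1).each do |j|
        key = array.delete_at(j) # O(n): removal shifts elements left

        # binary search for the insertion position in array[0..j-1]
        lo, hi = 0, j
        while lo < hi
            mid = (lo + hi) / 2
            if array[mid] <= key
                lo = mid + 1
            else
                hi = mid
            end
        end

        array.insert(lo, key) # O(n): insertion shifts elements right
    end

    return array
end

p binary_insertion_sort([5, 1, 10, 2]) # => [1, 2, 5, 10]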

A binary heap is a common way of implementing priority queues. It adds two constraints to a binary tree:

  • It is a complete binary tree, so it is fully balanced.
  • Every node is greater than or equal to its children (a max-heap).

The binary heap can be represented as a simple array, with the root stored at index 1. The children of an element at a given index i will always be at indices 2i and 2i+1, and the parent of a node is at index i/2 (using integer division).
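
For example (the sample array is my own, with index 0 unused so the arithmetic works out):

heap = [nil, 10, 7, 5, 3, 6]
i = 2            # the element 7
heap[2 * i]      # => 3  (left child of 7)
heap[2 * i + 1]  # => 6  (right child of 7)
heap[i / 2]      # => 10 (parent of 7)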

Implementation

class PriorityQueue
    attr_accessor :queue

    def initialize
        @queue = [nil]
    end

    # append the element, then bubble it up to its correct place
    def <<(element)
        @queue << element

        bubble_up(@queue.size - 1)
    end

    private
        # exchange the elements if the current element at index
        # is greater than its parent
        def bubble_up(index)
            parent_index = (index / 2)

            # base cases
            return if index == 1 
            return if @queue[parent_index] >= @queue[index] 

            # exchange elements
            @queue[index], @queue[parent_index] = @queue[parent_index], @queue[index]

            # recursive call
            bubble_up(parent_index)
        end
end

q = PriorityQueue.new
q << 10
q << 5
q << 7
# => [nil, 10, 5, 7]
class PriorityQueue

    # ...

    def pop
        # exchange first and last elements
        @queue[1], @queue[@queue.size - 1] = 
        @queue[@queue.size - 1], @queue[1]

        # remove the last element from the queue
        result = @queue.pop

        # bubble the new root down to restore the heap property
        bubble_down(1)
        
        return result
    end

    private 
    
        # ...

        def bubble_down(index)
            child_index = (index * 2)

            return if child_index > @queue.size - 1

            child_left = @queue[child_index]
            child_right = @queue[child_index + 1]

            # if both children are present, pick the bigger one
            if child_left != nil && child_right != nil
                child_index += 1 if child_right > child_left
            # if the left child is nil, select the right child;
            # when only the right child is nil nothing changes,
            # since child_index already points to the left child
            elsif child_left.nil?
                child_index += 1
            end

            # if the current element is bigger then stop,
            # otherwise exchange and bubble down again
            if @queue[index] >= @queue[child_index]
                return
            else
                @queue[index], @queue[child_index] = @queue[child_index], @queue[index]
                bubble_down(child_index)
            end
        end
end

q = PriorityQueue.new
q << 10
q << 5
q << 7
# => [nil, 10, 5, 7]

p q.pop # => 10
p q.pop # => 7
p q.pop # => 5

The following is the definition of an AVL tree according to Wikipedia:

In computer science, an AVL tree is a self-balancing binary search tree. It was the first such data structure to be invented. In an AVL tree, the heights of the two child subtrees of any node differ by at most one; if at any time they differ by more than one, rebalancing is done to restore this property.

Big-O Complexity

Note: N is the number of nodes in the tree prior to the operation

Lookup, insertion, and deletion all take O(log N) time in both the average and worst cases because the tree is self-balancing: the difference between the heights of the left and right sub-trees is at most one for every node.

Implementation

class AVLTree
    class Node
        attr_reader :value, :left, :right

        def initialize(value)
            @value = value
            @left  = nil
            @right = nil
        end

        def insert(v)
            case @value <=> v
            when -1 # insert into right subtree
                @right.nil? ? @right = Node.new(v) : @right.insert(v)
            when 1  # insert into left subtree
                @left.nil? ? @left = Node.new(v) : @left.insert(v)
            end
        end
    end

    def initialize(v)
        @root = Node.new(v)
    end

    def insert(v)
        @root.insert(v)
    end

    # checks the AVL height property at every node, not just the root
    def is_balanced?(node=@root)
        return true if node.nil? # base case

        h_diff = get_height(node.left) - get_height(node.right)
        return false if h_diff.abs > 1

        return is_balanced?(node.left) && is_balanced?(node.right)
    end

    def get_height(node=@root)
        return 0 if node.nil? # base case

        h_left  = get_height(node.left)
        h_right = get_height(node.right)

        return [h_left, h_right].max + 1
    end
end

# Create an AVL Tree
avl_tree = AVLTree.new(10)
avl_tree.insert(5)
avl_tree.insert(15)

puts avl_tree.get_height   # => 2
puts avl_tree.is_balanced? # => true
puts avl_tree.inspect
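
Note that the class above only detects imbalance; it does not perform the rebalancing the definition mentions. As a rough sketch of what rebalancing involves (not part of the original class, and assuming Node used attr_accessor :left, :right instead of attr_reader), a right rotation looks like this:

def rotate_right(node)
    new_root       = node.left      # left child becomes the new subtree root
    node.left      = new_root.right # its right subtree moves under the old root
    new_root.right = node           # the old root becomes the right child
    return new_root                 # caller re-attaches the returned subtree
end

The caller replaces its reference to node with the returned new_root; a left rotation is the mirror image.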

The following is the definition of a Binary Search Tree (BST) according to Wikipedia:

In computer science, binary search trees (BST), sometimes called ordered or sorted binary trees, are a particular type of container: data structures that store “items” (such as numbers, names etc.) in memory. They allow fast lookup, addition and removal of items, and can be used to implement either dynamic sets of items, or lookup tables that allow finding an item by its key (e.g., finding the phone number of a person by name).

Key features of BST:

  • The left sub-tree of a node has a key less than its parent node’s key
  • The right sub-tree of a node has a key greater than its parent node’s key

Basic Operations

  • Search
  • Insert
  • Pre-order Traversal
  • In-order Traversal
  • Post-order Traversal

Big-O Complexity

Note: N is the number of nodes in the tree

The time complexity of BST operations is generally O(log N) for a balanced tree, but O(N) for an unbalanced tree, which behaves like a linear data structure.

This website has a great table comparing complexities: Big-O Algorithm Complexity Cheat Sheet

Implementation

# Ruby version
class BinarySearchTree
    class Node
        attr_reader :value, :left, :right

        def initialize(value)
            @value = value
            @left  = nil
            @right = nil
        end

        # It skips if the same value is inserted
        def insert(value)
            case value <=> @value
            when 1  # insert to right sub-tree
                @right.nil? ? @right = Node.new(value) : @right.insert(value)
            when -1 # insert to left sub-tree
                @left.nil? ? @left = Node.new(value) : @left.insert(value)
            end
        end
    end

    def initialize(value)
        @root = Node.new(value)
    end

    def insert(value)
        @root.insert(value)
    end

    def print_inorder(node=@root)
        return if node.nil? # base case

        print_inorder(node.left)
        print "#{node.value} "
        print_inorder(node.right)
    end

    def search(value, node=@root)
        return false if node.nil? # base case
        case node.value <=> value
        when 0  then true
        when -1 then search(value, node.right)
        when 1  then search(value, node.left)
        end
    end
end
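
A quick usage example in the style of the earlier sections (the sample values are my own):

bst = BinarySearchTree.new(10)
bst.insert(5)
bst.insert(15)

bst.print_inorder  # prints: 5 10 15
puts
p bst.search(15)   # => true
p bst.search(3)    # => false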