ALGORITHMS ANIMATOR Program Documentation

 

Massimo Di Pierro

 

 

Table of Contents



For more on-line information about these and other algorithms visit NIST.

 

Overview

ALGORITHMS ANIMATOR is a Program written in Python that graphically displays some of the most common algorithms.

The Program was written for educational purposes, in the belief that visualizing an algorithm helps students understand its underlying structure. A selection of algorithms from standard undergraduate textbooks on the subject has been implemented.

Students can build their own objects (Lists, Trees and Graphs) and run any of the implemented algorithms on them. Objects can also be saved and loaded.

The program can also be used to generate PostScript representations of Lists, Trees and Graphs.

The name of the program ALGORITHMS ANIMATOR comes from the name of the algorithms class taught by the author of this program at DePaul University (Chicago, IL, USA).

Only two classes are required to understand the ALGORITHMS ANIMATOR code: class Node and class Link. The former is used to represent a node in a List, in a Tree or in a Graph. The latter is used to represent a link (edge) in a Graph.

Each Node has the following attributes:

Each Link has the following attributes:

The Program has four main menus: [List], [Tree], [Graph] and [Examples]. At any one time, the Program keeps in memory one list (self.list), one tree (self.tree) and one graph (self.graph). Each of these three objects can be created, modified, loaded and saved without affecting the other two. List algorithms act on self.list, tree algorithms act on self.tree and graph algorithms act on self.graph. The algorithms do not automatically store their output back in self. For example, if one creates a list and performs an InsertionSort, the sorted list is not stored into self.list. To store the sorted list into self.list, click on the [Store] button when the frame showing the sorted list is visualized.

Every algorithm runs in its own animation window. Each animation window displays the current animation frame and the following items:

[Back to Index]

List

Definition: A collection of items connected one to another.  In the Program we do not differentiate a List from an Array.

In the ALGORITHMS ANIMATOR Program a list is represented as a Python list. For example to create a list choose [List][Create] and, at the prompt, type one of the following example lists:

[3,5,7,8,3,9]

['a', 'b', 'c', 'd', 'e']

Note that strings need to be in quotes.

[Back to Index]

List . Reordering Algorithms

[Back to List | Back to Index]

List . Reordering Algorithms . Swap

Definition: Exchange the values of two nodes in a List, identified by the indices i and j.

The Program takes two integers i and j as input and swaps self.list[i] with self.list[j].

[Back to List | Back to Index]

List . Reordering Algorithms . Randomize

Definition: Shuffle the nodes in a List. We use the Python random generator (module random).

[Back to List | Back to Index]

List . Stack Algorithms

Definition: A collection of items in which only the most recently added item may be removed. The latest added item is at the top. Basic operations are push and pop. Also known as "last-in, first-out" or LIFO.

[Back to List | Back to Index]

List . Stack Algorithms . Push

Insert a node as the first element of the list A. Running time is in O(1).

[Back to List | Back to Index]

List . Stack Algorithms . Pop

Remove and return the first element of the list A. Running time is in O(1).
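Both stack operations can be sketched in Python with collections.deque, which keeps insertion and removal at the front in O(1) (a plain Python list would make front operations O(n)). This is a minimal sketch, not the Program's actual code; the function names are illustrative:

```python
from collections import deque

def push(A, x):
    # insert x as the first element of A (the top of the stack)
    A.appendleft(x)

def pop(A):
    # remove and return the first element of A (the top of the stack)
    return A.popleft()

A = deque([3, 5, 7])
push(A, 9)    # A is now deque([9, 3, 5, 7])
x = pop(A)    # x == 9, A is back to deque([3, 5, 7])
```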

[Back to List | Back to Index]

List . Queue Algorithms

Definition: A set of items in which only the earliest added item may be accessed. Basic operations are add (to the tail) or enqueue and delete (from the head) or dequeue. Delete returns the item removed. Also known as "first-in, first-out" or FIFO.

[Back to List | Back to Index]

List . Queue Algorithms . Enqueue

Insert a node as the last element of the list A. Running time is in O(1).

[Back to List | Back to Index]

List . Queue Algorithms . Dequeue

Remove and return the first element of the list A. Running time is in O(1).
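A sketch of both queue operations, again using collections.deque so that removing from the head stays O(1) (a plain list's pop(0) is O(n)). The function names are illustrative, not the Program's own:

```python
from collections import deque

def enqueue(A, x):
    # insert x as the last element of A (the tail of the queue)
    A.append(x)

def dequeue(A):
    # remove and return the first element of A (the head of the queue)
    return A.popleft()

A = deque(['a', 'b'])
enqueue(A, 'c')   # A is now deque(['a', 'b', 'c'])
x = dequeue(A)    # x == 'a': first-in, first-out
```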

[Back to List | Back to Index]

List . Heap Algorithms

Definition: A complete binary tree where every node has a value more extreme (greater or less) than or equal to the values of its children nodes (heap-property). A complete binary tree can be seen as a List where the parent-child relationship is implemented through the following algorithms:

def Parent(i):
    return int((i-1)/2)

def Left(i):
    return 2*i+1

def Right(i):
    return 2*i+2

[Back to List | Back to Index]

List . Heap Algorithms . Heapify

Definition: Rearrange a heap to maintain the heap-property, that is, the value of the root node is more extreme (greater or less) than or equal to the keys of its children nodes. If the root node's key is not more extreme, swap it with the most extreme child key, then recursively heapify that child's subtree. The child subtrees must be heaps to start.

For an array implementation, heapify takes O(log n) or O(h) time under the comparison model, where n is the number of nodes and h is the height of the tree.

[Back to List | Back to Index]

List . Heap Algorithms . BuildHeap

Definition: Convert a List/Array into a heap by executing heapify progressively closer to the root. For an array of n nodes, this takes O(n) time under the comparison model.
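Heapify and BuildHeap can be sketched for a max-heap stored in a Python list, using the index arithmetic shown above. This is a minimal illustration, not the Program's actual implementation; the function names are assumptions:

```python
def Left(i):  return 2*i + 1
def Right(i): return 2*i + 2

def heapify(A, i, n):
    # restore the max-heap property at index i, assuming the subtrees
    # rooted at Left(i) and Right(i) are already heaps; O(log n)
    largest = i
    l, r = Left(i), Right(i)
    if l < n and A[l] > A[largest]:
        largest = l
    if r < n and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]   # swap with the larger child
        heapify(A, largest, n)                # recurse into that subtree

def build_heap(A):
    # run heapify from the last internal node up to the root; O(n) total
    n = len(A)
    for i in range(n//2 - 1, -1, -1):
        heapify(A, i, n)

A = [3, 5, 7, 8, 3, 9]
build_heap(A)   # the maximum, 9, ends up at the root A[0]
```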

[Back to List | Back to Index]

List . Heap Algorithms . HeapExtractMax

The algorithm returns the node with the maximum value in a Heap (the root node) and restores the heap-property.

[Back to List | Back to Index]

List . Heap Algorithms . HeapInsert

The algorithm appends a new node to a Heap and restores the heap-property. Running time is in O(log n).

[Back to List | Back to Index]

List . Heap Algorithms . tree(heap) to list

Convert a heap from the tree representation (self.tree) to a list representation (self.list).

[Back to List | Back to Index]

List . Sort Algorithms

Definition: Arrange items in a predetermined order. There are dozens of algorithms, the choice of which depends on factors such as the number of items relative to working memory, knowledge of the orderliness of the items or the range of the values, the cost of comparing keys vs. the cost of moving items, etc. Most algorithms can be implemented as an in-place sort, and many can be implemented so they are stable.

[Back to List | Back to Index]

List . Sort Algorithms . InsertionSort

Definition: Sort by repeatedly taking the next item and inserting it into the final data structure in its proper order with respect to items already inserted. Running time is in O(n^2). Average and worst case are in Theta(n^2). In the best case (presorted lists) it is in Theta(n).
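The definition above translates directly into a short in-place sort; a sketch (not the Program's actual code):

```python
def insertion_sort(A):
    # sort A in place: insert each A[j] into the sorted prefix A[0:j]
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i+1] = A[i]   # shift larger items one position to the right
            i -= 1
        A[i+1] = key        # drop key into its proper place
    return A

insertion_sort([3, 5, 7, 8, 3, 9])   # returns [3, 3, 5, 7, 8, 9]
```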

[Back to List | Back to Index]

List . Sort Algorithms . MergeSort

Definition: A sort algorithm which splits the items to be sorted into two groups, recursively sorts each group, and merges them into a final, sorted sequence. Run time for Merge is Theta(n). Run time for MergeSort is Theta(n log n).

MergeSort provides an example of Divide and Conquer strategy.
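The split-sort-merge recursion can be sketched as follows (a simplified illustration, not the Program's own implementation):

```python
def merge(L, R):
    # merge two sorted lists into one sorted list: Theta(n)
    out, i, j = [], 0, 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            out.append(L[i]); i += 1
        else:
            out.append(R[j]); j += 1
    return out + L[i:] + R[j:]   # append whichever side has leftovers

def merge_sort(A):
    # split into halves, recursively sort each, merge: Theta(n log n)
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))

merge_sort([3, 5, 7, 8, 3, 9])   # returns [3, 3, 5, 7, 8, 9]
```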

[Back to List | Back to Index]

List . Sort Algorithms . MergeSortDP (non-recursive)

Definition: A sort algorithm which splits the items to be sorted into two groups, recursively sorts each group, and merges them into a final, sorted sequence. Run time is Theta(n log n).

MergeSortDP provides an example of Dynamic Programming strategy.

[Back to List | Back to Index]

List . Sort Algorithms . QuickSort

Definition: An in place sort algorithm that uses the divide and conquer paradigm. It picks an element from the array (the pivot), partitions the remaining elements into those greater than and less than this pivot, and recursively sorts the partitions. There are many variants of the basic scheme above: to select the pivot, to partition the array, to stop the recursion on small partitions, etc.

QuickSort provides an example of Divide and Conquer strategy.

QuickSort has running time Theta(n^2) in the worst case, but it is typically (average case) O(n log n). In practical situations, a finely tuned implementation of QuickSort beats most sort algorithms, including sort algorithms whose theoretical complexity is O(n log n) in the worst case.
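A sketch of the in-place scheme, with the first element as pivot as in the Partition algorithm described under Other Algorithms (illustrative code, not the Program's own):

```python
def partition(A, lo, hi):
    # rearrange A[lo:hi+1] around the pivot A[lo];
    # return the pivot's final index
    pivot = A[lo]
    i = lo
    for j in range(lo + 1, hi + 1):
        if A[j] < pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[lo], A[i] = A[i], A[lo]   # put the pivot between the two groups
    return i

def quick_sort(A, lo=0, hi=None):
    # recursively sort the two partitions on either side of the pivot
    if hi is None:
        hi = len(A) - 1
    if lo < hi:
        p = partition(A, lo, hi)
        quick_sort(A, lo, p - 1)
        quick_sort(A, p + 1, hi)
    return A

quick_sort([3, 5, 7, 8, 3, 9])   # returns [3, 3, 5, 7, 8, 9]
```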

[Back to List | Back to Index]

List . Sort Algorithms . RandomizedQuickSort

Same as the QuickSort but uses RandomizedPartition instead of Partition.

RandomizedQuickSort provides an example of Divide and Conquer strategy.

[Back to List | Back to Index]

List . Sort Algorithms . HeapSort

Definition: A sort algorithm which builds a heap, then repeatedly extracts the maximum item. Run time is O(n log n); in the best case (e.g., when all keys are equal) it is O(n).

[Back to List | Back to Index]

List . Sort Algorithms . CountingSort

Definition: A 2-pass sort algorithm that is efficient when the range of values is small and there are many duplicate keys. The first pass counts the occurrences of each key in an auxiliary list/array, and then makes a running total so each auxiliary entry is the number of preceding keys. The second pass puts each item in its final place according to the auxiliary entry for that key. CountingSort runs in linear time, Theta(n).
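The two passes can be sketched for integer keys in range(k); a simplified, stable illustration (not the Program's actual code):

```python
def counting_sort(A, k):
    # A contains integers in range(k); stable, Theta(n + k)
    count = [0] * k
    for x in A:                     # first pass: count occurrences of each key
        count[x] += 1
    total = 0
    for v in range(k):              # running total: count[v] becomes the
        count[v], total = total, total + count[v]   # number of preceding keys
    out = [None] * len(A)
    for x in A:                     # second pass: place each item
        out[count[x]] = x
        count[x] += 1
    return out

counting_sort([3, 5, 7, 8, 3, 9], 10)   # returns [3, 3, 5, 7, 8, 9]
```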

[Back to List | Back to Index]

List . Other Algorithms

[Back to List | Back to Index]

List . Other Algorithms . Partition

Definition: Rearrange the elements of a list into two groups, typically, such that elements in the first group are less than a pivot value and elements in the second group are greater. Runs in linear time, Theta(n).

In the present implementation the pivot is chosen to be the first element. Note that if the pivot is the minimum of the list, partition splits the list into two groups where the first contains only the pivot.

[Back to List | Back to Index]

List . Other Algorithms . RandomizedPartition

Similar to partition but a random element in the list is used as pivot.

[Back to List | Back to Index]

List . Other Algorithms . Minimum

Find Minimum in a List or Array. Runs in linear time, Theta(n).

[Back to List | Back to Index]

List . Other Algorithms . Maximum

Find Maximum in a List or Array. Runs in linear time, Theta(n).

[Back to List | Back to Index]

Tree

(data structure)

Definition: A data structure accessed beginning at the root node. Each node is either a leaf (has no children) or an internal node (has children). An internal node has one or more child nodes and is called the parent of its child nodes. All children of the same node are siblings. Contrary to a physical tree, the root is usually depicted at the top of the structure, and the leaves are depicted at the bottom.

A connected, undirected, acyclic graph is a Tree.

In the ALGORITHMS ANIMATOR Program a tree is represented as a Python list where the first element of the list is the root node and the other elements are the children. A child can be a subtree. For example to create a tree choose [Tree][Create] and, at the prompt, type one of the following example trees:

['root','a','b','c']

['root','a','b',['c',1,[2,7,8],3,4]]

[Back to Index]

Tree . Heap Algorithms

In the Program this menu contains the same algorithms as List . Heap Algorithms but the heaps are displayed in tree representation as opposed to a list representation.

[Back to Tree | Back to Index]

Tree . Heap Algorithms . list to tree(heap)

Convert a list (self.list) to a tree (self.tree).

[Back to Tree | Back to Index]

Tree . Binary Search Tree Algorithms

Definition: A tree with at most two children for each node where every node's left sub-tree has values less than the node's value, and every right sub-tree has values greater. A new node is added as a leaf.
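The basic operations can be sketched with each node as a small dict (this is not the Program's Node class; the structure and function names here are assumptions for illustration):

```python
def bst_insert(root, k):
    # insert key k as a new leaf; a node is {'key', 'left', 'right'}
    if root is None:
        return {'key': k, 'left': None, 'right': None}
    if k < root['key']:
        root['left'] = bst_insert(root['left'], k)
    else:
        root['right'] = bst_insert(root['right'], k)
    return root

def bst_search(root, k):
    # follow a single root-to-leaf path: O(h) comparisons
    if root is None or root['key'] == k:
        return root
    if k < root['key']:
        return bst_search(root['left'], k)
    return bst_search(root['right'], k)

def inorder(root):
    # left subtree, root, right subtree: yields keys in sorted order
    if root is None:
        return []
    return inorder(root['left']) + [root['key']] + inorder(root['right'])

root = None
for k in [5, 3, 8, 1]:
    root = bst_insert(root, k)
inorder(root)   # returns [1, 3, 5, 8]
```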

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . list to tree(binary)

Convert a list (self.list) into a binary tree (self.tree).

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-InorderWalk

Definition: Process all nodes of a tree by recursively processing the left sub-tree, then processing the root, and finally the right sub-tree.

BST-InorderWalk provides an example of Divide and Conquer strategy.

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-Search

Search a binary (search) tree for an input value k.

BST-Search provides an example of Divide and Conquer strategy.

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-Minimum

Search a binary (search) tree for the node with minimum value.

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-Maximum

Search a binary (search) tree for the node with maximum value.

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-Insert

Insert a new node in a binary search tree.

[Back to Tree | Back to Index]

Tree . Binary Tree Algorithms . BST-Delete

Delete a node from a binary search tree.

[Back to Tree | Back to Index]

Tree . AVL Tree Algorithms

Definition: A binary tree where the height of the two subtrees (children) of each node differs by at most one (balance property). Look-up, insertion, and deletion are O(log n), where n is the number of nodes in the tree.

[Back to Tree | Back to Index]

Tree . AVL Tree Algorithms . list to tree(AVL)

Convert a list (self.list) into an AVL Tree (self.tree)

[Back to Tree | Back to Index]

Tree . AVL Tree Algorithms . RebalanceNode

Definition: Restore the balance property of the node: the height of the two subtrees (children) of a node differs by at most one. Running time is in O(1).

[Back to Tree | Back to Index]

Tree . AVL Tree Algorithms . AVL-Insert

Insert a new node into an AVL Tree and rebalance the node and its ancestors. Running time is in O(log n) where n is the number of nodes in the tree.

[Back to Tree | Back to Index]

Tree . AVL Tree Algorithms .  AVL-Delete

Delete a node from an AVL Tree and rebalance its ancestors.  Running time is in O(log n) where n is the number of nodes in the tree.

[Back to Tree | Back to Index]

Tree . Other Algorithms

[Back to Tree | Back to Index]

Tree . Other Algorithms . Tree Height

Definition: The maximum distance of any leaf from the root of a tree. If a tree has only one node (the root), the height is zero.

[Back to Tree | Back to Index]

Tree . list to tree

Convert a list (self.list) into a tree (self.tree)

[Back to Tree | Back to Index]

Graph

In the ALGORITHMS ANIMATOR Program a graph G={V,E} is represented as a Python list containing two elements:

a list of node values, and a list of links, each of the form [i, j, length]. For example to create a graph choose [Graph][Create] and, at the prompt, type the following example graph:

[['a', 'b', 'c', 'd'], [[0,0,5], [0,1,3], [0,2,7], [3,1,9]]]

where 'a', 'b', 'c' and 'd' are the values at the nodes and [0,0,5] is a link (of length 5) connecting node 0 (value='a') with itself, [0,1,3] is a link (of length 3) connecting node 0 (value='a') with node 1 (value='b'), etc. Undirected links can be represented as a couple of directed links in opposite direction and same length.
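Most graph algorithms traverse the outgoing links of a node, so it can help to convert this two-element representation into an adjacency list first. A sketch under the representation above (the function name is an assumption, not part of the Program):

```python
def to_adjacency(G):
    # G = [values, links], where each link is [i, j, length];
    # returns {node index: [(neighbor index, length), ...]}
    values, links = G
    adj = {i: [] for i in range(len(values))}
    for i, j, length in links:
        adj[i].append((j, length))
    return adj

G = [['a', 'b', 'c', 'd'], [[0, 0, 5], [0, 1, 3], [0, 2, 7], [3, 1, 9]]]
to_adjacency(G)   # node 0 ('a') has outgoing links to 0, 1 and 2
```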

[Back to Index]

Graph . Algorithms

(data structure)

Definition: A set of items connected by links (edges). Each item is called a node or vertex. Formally, a graph G={V,E} is a set of vertices (nodes) V and a set of edges (links) E.

[Back to Graph | Back to Index]

Graph . Algorithms . Symmetrize

Convert a directed graph (self.graph) into an undirected graph (self.graph).

[Back to Graph | Back to Index]

Graph . Algorithms . BreadthFirstSearch

Definition: A search algorithm which considers neighbors of a node, that is, outgoing links of the node's predecessor in the search, before any outgoing link of the node. Extremes are searched last. This is typically implemented with a queue. Running time is in Theta(E+V) where V is the number of vertices and E is the number of edges.

[Back to Graph | Back to Index]

Graph . Algorithms . BFS-TopologicalSort

IN PROGRESS

[Back to Graph | Back to Index]

Graph . Algorithms . DepthFirstSearch

Definition: Similar to BreadthFirstSearch but the queue is replaced by a stack. Running time is in Theta(E+V) where V is the number of vertices and E is the number of edges.

[Back to Graph | Back to Index]

Graph . Algorithms . DFS-TopologicalSort

IN PROGRESS

[Back to Graph | Back to Index]

Graph . Algorithms . MST-Kruskal

Definition: An algorithm for computing a minimum spanning tree. It maintains a set of partial minimum spanning trees, and repeatedly adds the shortest link in the graph whose nodes are in different partial minimum spanning trees. Running time is in O(E log V) where V is the number of vertices and E is the number of edges.

MST-Kruskal provides an example of Greedy strategy.

[Back to Graph | Back to Index]

Graph . Algorithms . MST-Prim

Definition: An algorithm for computing a minimum spanning tree. It builds upon a single partial minimum spanning tree, at each step adding the link connecting the node nearest to, but not already in, the current partial minimum spanning tree. Running time is in O(E + V log V) where V is the number of vertices and E is the number of edges.

MST-Prim provides an example of Greedy strategy.

[Back to Graph | Back to Index]

Graph . Algorithms . Bellman-Ford

Definition: An efficient algorithm to find the shortest path from a single source node to all other nodes in a weighted, directed graph. The algorithm initializes the distance to the source node to 0 and all other vertices to Infinity. It then does V-1 passes (V is the number of nodes) over all links, relaxing, or updating, the distance to the destination of each link. Finally it checks each link again to detect negative weight cycles, in which case it returns false. Running time is in O(V E) where V is the number of vertices and E is the number of edges.

Bellman-Ford provides an example of Dynamic Programming strategy.

[Back to Graph | Back to Index]

Graph . Algorithms . Dijkstra

Definition: An algorithm to find the shortest path from a single source node to all other vertices in a weighted, directed graph. All weights must be nonnegative. Implementing the priority queue with a Fibonacci Heap makes the running time in O(E + V log V), where V is the number of vertices and E is the number of links.

Dijkstra provides an example of Greedy strategy.
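A sketch of Dijkstra on the graph representation used by the Program ([values, links] with links [i, j, length]). This version uses Python's heapq, a binary heap, which gives O(E log V) rather than the O(E + V log V) of a Fibonacci Heap; it is an illustration, not the Program's code:

```python
import heapq

def dijkstra(G, source):
    # returns shortest distances from source to every node index
    values, links = G
    adj = {i: [] for i in range(len(values))}
    for i, j, length in links:
        adj[i].append((j, length))
    dist = {i: float('inf') for i in range(len(values))}
    dist[source] = 0
    queue = [(0, source)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist[u]:
            continue                # stale queue entry, skip it
        for v, w in adj[u]:
            if d + w < dist[v]:     # relax the link (u, v)
                dist[v] = d + w
                heapq.heappush(queue, (dist[v], v))
    return dist

G = [['a', 'b', 'c', 'd'], [[0, 0, 5], [0, 1, 3], [0, 2, 7], [3, 1, 9]]]
dijkstra(G, 0)   # node 'd' is unreachable from 'a', so its distance stays inf
```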

[Back to Graph | Back to Index]

Graph . list to graph

Convert a list (self.list) into a graph representation (self.graph).

[Back to Graph | Back to Index]

Graph . tree to graph

Convert a tree (self.tree) into a graph representation (self.graph).

[Back to Graph | Back to Index]

Examples

The menu [Examples] includes those algorithms that use lists, trees or graphs internally but do not belong to any of the other menus.

[Back to Index]

Examples . Make Random List

The Program asks for an integer number, creates a random list and stores the list into self.list.

[Back to Examples | Back to Index]

Examples . Timing (sort)

The Program runs all sorting algorithms and times them on the present architecture for different list sizes and produces a text report. The timing has three modes:

[Back to Examples | Back to Index]

Examples . Tic Tac Toe

The Program produces the look-ahead tree for the Tic Tac Toe game. The user selects the initial moves and the height of the look-ahead tree.

[Back to Examples | Back to Index]

Examples . Huffman Encoding

Definition: A minimal variable-length character encoding based on the frequency of each character. First, each character becomes a trivial tree, with the character as the only node. The character's frequency is the tree's frequency. The two trees with the least frequencies are joined with a new root which is assigned the sum of their frequencies. This is repeated until all characters are in one tree. One code bit represents each level. Thus more frequent characters are near the root and are encoded with few bits, and rare characters are far from the root and are encoded with many bits.

The Program accepts the text to be compressed as input and produces a text report showing the compression rules and the compressed text. The Program also shows in an animation how the Huffman tree is built.

Huffman encoding provides an example of Greedy strategy.
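The tree-joining procedure can be sketched with a priority queue (heapq) holding the partial trees; repeatedly popping the two least frequent trees and pushing their join is exactly the greedy step. This is an illustrative sketch, not the Program's implementation; names and the dict-based tree are assumptions:

```python
import heapq

def huffman_codes(freq):
    # freq maps character -> frequency; returns character -> bit string.
    # A counter breaks frequency ties so the dicts are never compared.
    heap = [(f, i, {'char': c}) for i, (c, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # the two trees with least frequency
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, {'left': t1, 'right': t2}))
        counter += 1
    codes = {}
    def walk(t, code):
        # one bit per level: 0 for left, 1 for right
        if 'char' in t:
            codes[t['char']] = code or '0'
        else:
            walk(t['left'], code + '0')
            walk(t['right'], code + '1')
    walk(heap[0][2], '')
    return codes

huffman_codes({'a': 5, 'b': 2, 'c': 1, 'd': 1})
# the most frequent character 'a' gets the shortest code
```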

[Back to Examples | Back to Index]

Examples . Sequence Alignment

Definition: The problem of finding a maximum length (or maximum weight) common subsequence of two or more lists/arrays/strings. Running time is in Theta(n m) where n and m are the lengths of the first and second list respectively.

The Program accepts two strings as input (stored into two separate lists) and computes the Longest Common Subsequence by aligning the two lists.

Determining the LCS (sequence alignment) provides an example of Dynamic Programming strategy.
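The Dynamic Programming table and traceback can be sketched as follows (an illustration of the technique, not the Program's code):

```python
def lcs(a, b):
    # L[i][j] = length of the LCS of a[:i] and b[:j]; Theta(n m)
    n, m = len(a), len(b)
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i-1] == b[j-1]:
                L[i][j] = L[i-1][j-1] + 1
            else:
                L[i][j] = max(L[i-1][j], L[i][j-1])
    # trace back through the table to recover one longest common subsequence
    out, i, j = [], n, m
    while i > 0 and j > 0:
        if a[i-1] == b[j-1]:
            out.append(a[i-1]); i -= 1; j -= 1
        elif L[i-1][j] >= L[i][j-1]:
            i -= 1
        else:
            j -= 1
    return ''.join(reversed(out))

lcs('ABCBDAB', 'BDCABA')   # a common subsequence of length 4
```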

[Back to Examples | Back to Index]

 

Definitions

[Back to Index]

Definitions . Divide and Conquer

Definition: Divide-and-Conquer is a method of designing algorithms that (informally) proceeds as follows: Given an instance of the problem to be solved, split it into several smaller sub-instances (of the same problem), independently solve each sub-instance, and then combine the sub-instance solutions to yield a solution for the original instance. This description raises the question: By what methods are the sub-instances to be independently solved? The answer to this question is central to the concept of the Divide-and-Conquer algorithm and is a key factor in gauging its efficiency. The solution depends on the problem.

The MergeSort and QuickSort algorithms provide examples of Divide-and-Conquer.

[Back to Definitions | Back to Index]

Definitions . Greedy Strategy

Definition: Greedy algorithms work in phases. In each phase, a decision is made that appears to be good, without regard for future consequences. Generally, this means that some local optimum is chosen. This `take what you can get now' strategy is the source of the name for this class of algorithms. When the algorithm terminates, we hope that the local optimum is equal to the global optimum. If this is the case, then the algorithm is correct; otherwise, the algorithm has produced a suboptimal solution. If the best answer is not required, then simple greedy algorithms are sometimes used to generate approximate answers, rather than using the more complicated algorithms generally required to generate an exact answer. Frequently the Greedy Approach constitutes the basis of a heuristic approach. Even for problems which can be solved exactly by a greedy algorithm, establishing the correctness of the method may be a non-trivial process.

The Kruskal and Prim algorithms provide examples of Greedy Strategy that lead to optimal solutions.

[Back to Definitions | Back to Index]

Definitions . Dynamic Programming

Definition: Dynamic Programming is a paradigm that is most often applied in the construction of algorithms to solve a certain class of optimization problems, that is, problems which require the minimization or maximization of some measure. One disadvantage of using Divide-and-Conquer is that the process of recursively solving separate sub-instances can result in the same computations being performed repeatedly, since identical sub-instances may arise. The idea behind Dynamic Programming is to avoid this pathology by obviating the requirement to calculate the same quantity twice. The method usually accomplishes this by maintaining a table of sub-instance results. We say that Dynamic Programming is a Bottom-Up technique in which the smallest sub-instances are explicitly solved first and their results used to construct solutions to progressively larger sub-instances. In contrast, we say that Divide-and-Conquer is a Top-Down technique.

The MergeSortDP algorithm provides an example of Dynamic Programming.

[Back to Definitions | Back to Index]

Definitions . Memoization

Definition: An algorithmic technique which saves (memoizes) a computed answer for later reuse, rather than re-computing the answer each time it is needed. Usually the Memoization approach is classified as Dynamic Programming even though it is a top-down approach, the reason being that both techniques use tables to store intermediate results.
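The standard library makes the technique a one-liner; a sketch using functools.lru_cache on the Fibonacci recursion (a textbook illustration, not part of the Program):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # without the cache this recursion recomputes the same sub-instances
    # exponentially many times; memoization makes it O(n)
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(30)   # returns 832040, instantly even though the naive recursion would not
```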

[Back to Definitions | Back to Index]

Definitions . Backtracking

Definition: An algorithmic technique to find solutions by trying one of several choices. If the choice proves incorrect, computation backtracks or restarts at the point of choice and tries another choice. It is often convenient to maintain choice points and alternate choices using recursion.

[Back to Definitions | Back to Index]

Definitions . Order of Growth

Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = O(g(n)) means that f(n) is less than some constant multiple of g(n). The notation is read, "f of n is big oh of g of n".

Formal Definition: f(n) = O(g(n)) means there are positive constants c and k, such that 0 <= f(n) <= cg(n) for all n >= k. The values of c and k must be fixed for the function f and must not depend on n.

Formal Definition: f(n) = Omega(g(n)) means there are positive constants c and k, such that 0 <= cg(n) <= f(n) for all n >= k. The values of c and k must be fixed for the function f and must not depend on n.

Formal Definition: f(n) = Theta(g(n)) means there are positive constants c1, c2, and k, such that 0 <= c1g(n) <= f(n) <= c2g(n) for all n >= k. The values of c1, c2, and k must be fixed for the function f and must not depend on n.

[Back to Definitions | Back to Index]