CS 327, Spring 2019
Information on the First Test

The first in-class test for this course will take place on Monday, March 4. You should expect a mix of definitions and essay questions; simple code analysis; understanding data structures such as B-Trees and representations of graphs; and applying basic algorithms such as merge sort, heap operations, inserting into a hash table, and graph traversal. Essay-type questions will make up the majority of the test. You will not be asked to code any complex algorithms. You might have to give an outline of an algorithm (such as heapsort or breadth-first search of a graph). You might even have to code something simple (such as quicksort, given the partitioning subroutine).

The test covers material that is in the following chapters from the textbook (but mostly avoiding the math): Chapters 2, 3, 4, 6, 7, 8, 9, 11, 12, 18, and 22.

Here is a list of some of the things that you should know about:

problems and algorithms

loop invariants
using a loop invariant to prove correctness of a loop
working with loop invariants:  Initialization, Maintenance, Termination
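For example, here is a rough sketch of insertion sort (in Python; this is illustrative code, not taken from the textbook), with comments showing how the Initialization, Maintenance, and Termination steps of a loop-invariant argument apply to it:

def insertion_sort(a):
    # Loop invariant: at the start of each iteration of the for loop,
    # a[0..i-1] consists of the original first i elements, in sorted order.
    #
    # Initialization: before the first iteration, i = 1 and a[0..0] is a
    # single element, which is trivially sorted.
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements of the sorted prefix that are greater than key
        # one position to the right, then drop key into the gap.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
        # Maintenance: a[0..i] is now sorted, so the invariant holds at the
        # start of the next iteration.
    # Termination: the loop ends after the i = len(a)-1 iteration, at which
    # point the invariant says the entire array is sorted.
    return a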

run-time analysis of basic (non-recursive) algorithms
worst case, average case, and best case analysis
"order of growth" of a function
comparing order of growth of common functions:  
       1, log(n), n, n*log(n), n^2, n^3, 2^n, n!
Big-Oh:    T(n) = O(f(n))  if  T(n) ≤ c*f(n) for some constant c > 0 and all sufficiently large n
Big-Omega: T(n) = Ω(f(n))  if  T(n) ≥ c*f(n) for some constant c > 0 and all sufficiently large n
Big-Theta: T(n) = Θ(f(n))  if  both T(n) = O(f(n))  and  T(n) = Ω(f(n))
for Ω, Θ, and Big-Oh, you "ignore constant multiples and lower order terms"
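For example, 3*n^2 + 10*n + 7 = Θ(n^2): ignoring the constant multiple 3 and the lower order terms 10*n and 7 leaves just n^2.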

recurrence relations
the Master Theorem for solving recurrence relations  T(n) = a*T(n/b) + Θ(f(n)):
     Case 1:  f(n) = O(n^d) where d < log_b(a).  Then T(n) = Θ(n^(log_b(a)))
     Case 2:  f(n) = Θ(n^d) where d = log_b(a).  Then T(n) = Θ(n^(log_b(a)) * log(n))
     Case 3:  f(n) = Θ(n^d) where d > log_b(a).  Then T(n) = Θ(n^d) = Θ(f(n))
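     Example:  mergesort's recurrence is T(n) = 2*T(n/2) + Θ(n), so a = 2, b = 2, and
     f(n) = n = n^d with d = 1 = log_2(2); Case 2 gives T(n) = Θ(n*log(n)).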

data structures
    heaps
    the max-heap property (or min-heap property) of a binary tree
    representing a heap as an array
    heap operations:  insert and removeMax [or removeMin] (see the sketch after this list)
    priority queues
    using a heap as a priority queue
    hash tables, hash functions, and hash codes
    collisions in a hash table, and how they are handled
    open hash tables, using chaining to handle collisions
    closed hash tables, using open addressing to handle collisions
    linear probing in a closed hash table
    given a good hash function, operations on a hash table run in expected (average-case) time Θ(1)
    binary search tree
    height of a tree; balanced binary tree
    insert, delete, find operations take time Θ(log(n)) on a balanced binary tree
    B-Trees
    minimal degree of a B-Tree
    data structures on secondary storage
    dynamic arrays
    amortized cost of adding an item to a dynamic array
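To make the heap items above concrete, here is a rough sketch (in Python, with 0-based array indexing, so the parent of index i is at (i-1)//2 and its children are at 2*i+1 and 2*i+2; the function names are just illustrative) of a max-heap stored in an array, with the insert and removeMax operations:

def heap_insert(heap, x):
    # Add x at the end of the array, then "bubble it up" past smaller parents
    # until the max-heap property is restored.
    heap.append(x)
    i = len(heap) - 1
    while i > 0 and heap[(i - 1) // 2] < heap[i]:
        parent = (i - 1) // 2
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

def heap_remove_max(heap):
    # Remove and return the maximum, which is at the root (index 0).
    # Assumes the heap is not empty.
    maximum = heap[0]
    last = heap.pop()
    if heap:
        heap[0] = last              # move the last item to the root...
        i, n = 0, len(heap)
        while True:                 # ...and sift it down below larger children
            left, right, largest = 2*i + 1, 2*i + 2, i
            if left < n and heap[left] > heap[largest]:
                largest = left
            if right < n and heap[right] > heap[largest]:
                largest = right
            if largest == i:
                break
            heap[i], heap[largest] = heap[largest], heap[i]
            i = largest
    return maximum

Each operation follows a single path between the root and a leaf, so both run in time O(log(n)); that is also why a heap makes a good priority queue.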

sorting algorithms and their run-time analysis:
    any sorting algorithm that works by comparisons has worst-case run time Ω(n*log(n))
    in-place sorting algorithm (selection sort, insertion sort, heapsort)
    stable sorting algorithm (insertion sort, mergesort, counting sort, radix sort)
    selection sort (run time Θ(n^2))
    insertion sort (average run time Θ(n^2); best case Θ(n))
    mergesort (run time Θ(n*log(n)))
    heapsort (run time Θ(n*log(n)))
    quicksort (worst case run time Θ(n^2), average case Θ(n*log(n)))
    the quicksort partitioning algorithm (see the sketch after this list)
    randomizing the choice of pivot in quicksort, and why it's a good idea
    counting sort (run time Θ(n), but only applies in certain cases)
    radix sort (run time Θ(n), but only applies in certain cases)
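Here is a rough sketch of the quicksort partitioning step mentioned above (in Python, using the Lomuto partitioning scheme with a randomized pivot; the textbook's version may differ in details such as how the pivot is chosen):

import random

def partition(a, lo, hi):
    # Rearrange a[lo..hi] around a pivot.  Postcondition: the pivot ends up in
    # its final sorted position p, everything in a[lo..p-1] is <= the pivot,
    # and everything in a[p+1..hi] is > the pivot.
    r = random.randint(lo, hi)      # randomizing the pivot makes the Θ(n^2)
    a[r], a[hi] = a[hi], a[r]       # worst case very unlikely
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

def quicksort(a, lo, hi):
    # Sort a[lo..hi] in place: partition, then recursively sort the two sides.
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)

A call such as quicksort(a, 0, len(a) - 1) sorts the whole list in place.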

graphs:
    vertices and edges
    directed graphs and undirected graphs
    adjacency matrix representation of a graph
    adjacency list representation of a graph
    path in a graph
    cycle
    basic graph traversal algorithms
    marking vertices as "undiscovered", "discovered", and "processed"
    breadth-first search
    implementation of breadth-first search with a queue
    recursive depth-first search
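Here is a rough sketch of the last two items (in Python, assuming an adjacency-list representation in which adj[v] is the list of neighbors of vertex v; the identifiers are illustrative, not from the textbook):

from collections import deque

def bfs(adj, start):
    # Breadth-first search from start, using a queue.  A vertex goes from
    # "undiscovered" to "discovered" when it enters the queue, and to
    # "processed" once all of its neighbors have been examined.
    state = {v: "undiscovered" for v in adj}
    state[start] = "discovered"
    queue = deque([start])
    order = []                      # vertices in the order they are processed
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if state[w] == "undiscovered":
                state[w] = "discovered"
                queue.append(w)
        state[v] = "processed"
        order.append(v)
    return order

def dfs(adj, v, visited=None):
    # Recursive depth-first search: visit v, then recurse on each neighbor
    # that has not been visited yet.
    if visited is None:
        visited = set()
    visited.add(v)
    for w in adj[v]:
        if w not in visited:
            dfs(adj, w, visited)
    return visited

With adjacency lists, each traversal looks at every vertex and every edge a bounded number of times, so both run in time O(V + E).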


To give you some idea of the kind of questions that might be on the test, here are just a few questions from old CS 327 exams:

1. Suppose A(n) is the average-case run time and W(n) is the worst-case run time of the same algorithm. True or false: A(n) = O(W(n)). Explain your answer. [With this question, I was probably too clever for my own good.]

2. The basic operation in the Quicksort algorithm is partitioning using a pivot value. Explain the goal (or "postcondition") of partitioning.

3. Mark each of the following statements as being true or false:   (a) 5*n^3 + n^2 = Θ(n^3)    (b) n*log(n) = Θ(n)    (c) n*log(n) = Ω(n)    (d) n^5 = O(n!)    (e) n + log(n) = Θ(n)    (f) n^3 + n^2 = Θ(n^5)

4. Give an algorithm (in clear pseudocode, with comments where needed) for breadth-first search of a graph, starting from some given vertex v.

5. Draw the (undirected) graph that has the following adjacency matrix representation: [Insert some square matrix of zeros and ones here.]

6. B-trees are balanced trees. Explain what this means and why it is important. (What does being balanced say about the run time for inserting, deleting, and searching for items in the tree?)

7. Consider the heap shown below as a binary tree. Draw the heap after the following operations are performed, one after the other: insert(24), removeMax(), removeMax(). [Insert a max-heap represented as a binary tree here.]