CPSC 327 Data Structures and Algorithms Spring 2011

CPSC 327 Syllabus

Readings and thinking problems are to be done for the class period where they are listed.

Dates for things in light gray are tentative and may shift slightly.


Week 1: 1/17-1/21

Topics: introduction; algorithm analysis


Wed Notes: course introduction    

Fri Reading: Edmonds ch 23.1, 24 (review if you aren't familiar with the mathematics of logs and exponents), 25 (introduction and section 25.1 only)

Thinking Problem: An algorithm takes n English words as input. Is the number of words a fair measure of input size? Explain.

Notes: time complexity

homework #1
due Wed 1/26

Week 2: 1/24-1/28

Topics: algorithm analysis


Mon Notes: big-Oh, big-Theta, big-Omega  

Wed Reading: Edmonds ch 26 (introduction and section 26.1 only)

Notes: homework 1, T(n) from algorithms

homework #2
due Mon 1/31

Fri Notes: homework 1, T(n) from algorithms, sums

Week 3: 1/31-2/4

Topics: iterative algorithms


Mon Reading: Edmonds chapter 1

Thinking Problem: The algorithm development steps presented in the chapter help you figure out the algorithm, but they don't directly produce pseudocode or code. Write pseudocode or Java code for the binary search algorithm developed in example 1.4.3 (pages 24-25), then point out how that matches up with the 13 steps given. (The idea here is to think about how the pieces of information from each step come together into the complete algorithm.)
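For reference, binary search on a sorted array can be rendered in Java roughly as follows (a sketch only - class and method names are illustrative, not from the text, and your answer should connect the pieces to the development steps):

```java
// Iterative binary search over a sorted int array.
// Returns the index of target, or -1 if target is not present.
public class BinarySearch {
    public static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;      // invariant: if target is in a, its index is in [lo, hi]
        while (lo <= hi) {                  // exit condition: search space is empty
            int mid = lo + (hi - lo) / 2;   // written this way to avoid overflow in lo + hi
            if (a[mid] == target) {
                return mid;
            } else if (a[mid] < target) {
                lo = mid + 1;               // narrow the search space to the right half
            } else {
                hi = mid - 1;               // narrow the search space to the left half
            }
        }
        return -1;                          // search space empty: target is not in the array
    }
}
```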

Notes: steps for developing iterative algorithms

homework #3
due Wed 2/2
project #1
Sort Analysis
due Mon 2/14

Wed Reading: Edmonds chapter 2

Thinking Problem: Give a more-of-the-input style loop invariant for the following problem: A is a sorted array of length n and B is a sorted array of length m. Produce C, a sorted array containing the elements from A and B (including duplicates, if there are any).
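One way the merge itself can look in Java, with a more-of-the-input style invariant stated as a comment (names here are illustrative, not from the text):

```java
// Merge two sorted int arrays into one sorted array, keeping duplicates.
public class Merge {
    public static int[] merge(int[] a, int[] b) {
        int[] c = new int[a.length + b.length];
        int i = 0, j = 0;
        // loop invariant (more-of-the-input): after k iterations, c[0..k-1] holds
        // the first i elements of a and the first j elements of b, in sorted order.
        for (int k = 0; k < c.length; k++) {
            if (j >= b.length || (i < a.length && a[i] <= b[j])) {
                c[k] = a[i++];   // a's next element is the smaller (or b is exhausted)
            } else {
                c[k] = b[j++];   // b's next element is the smaller (or a is exhausted)
            }
        }
        return c;
    }
}
```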

Notes: using the steps to develop iterative algorithms


Fri Reading: Edmonds chapters 4-5

Notes: basic steps, measures of progress, and loop invariants

homework #4
due Wed 2/9

Week 4: 2/7-2/11

Topics: iterative algorithms; greedy algorithms

Mon Notes: more of the output and narrowing the search space

Wed Reading: Edmonds sections 16.1-16.2.2

Thinking Problem: Exercise 16.1.1 (page 235). Justify your answer in the style of the proof in the chapter (a proof or a counterexample) - don't just state "yes" or "no".

Notes: greedy algorithms
homework #5
due Fri 2/11

Fri Notes: greedy algorithms (staying ahead)  

Week 5: 2/14-2/18

Topics: greedy algorithms; ADTs


Mon Notes: staying ahead (making change)
homework #6
due Wed 2/16

Wed Notes: staying ahead (driving to Seattle, people on hold)
homework #7
due Fri 2/18

Fri Reading: Edmonds chapter 3

Thinking Problem: Categorize the ADTs presented in the reading according to how elements are positioned and the primary kinds of methods the collection supports:

  • how elements are positioned: algorithmically (the ADT determines how elements are arranged) or manually (the user of the ADT has control over how elements are arranged)
  • what methods the collection supports: determine if an element is a member of the collection, after/before (methods based on ordering of elements), access elements with highest priority, access elements at any position, access elements only at ends (first/last)

Notes: a couple of comments on hw #6 but mostly ADTs and thinking about implementation details

  midterm #1
due Wed 2/23

(review information)

Week 6: 2/21-2/25

Topics: implementing ADTs


Mon Notes: ADTs and data structures  

Wed Reading: heaps (start with slide 9 on page 5 - "What is a heap?")

Notes: priority queues and heaps

homework #8
due Fri 2/25

Fri Reading:

Thinking Problems:

  1. Insert the elements 50, 60, 70, 20, 10, 25, 40, 30, 28 into an initially empty AVL tree, drawing the tree after each step.
  2. Insert the elements 5, 16, 22, 45, 2, 10, 18, 30, 50, 12, 1 into an initially empty 2-4 tree, drawing the tree after each step.

Notes: array implementation of heaps, AVL trees

  project #2
Algorithm X
due Fri 3/11
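The array implementation of heaps mentioned in the notes can be sketched as below, assuming a min-heap of ints with children of index i at 2i+1 and 2i+2 (an illustration only, not the course's reference implementation):

```java
import java.util.ArrayList;

// Minimal array-based binary min-heap.
public class MinHeap {
    private final ArrayList<Integer> a = new ArrayList<>();

    public void insert(int x) {
        a.add(x);                       // add at the next free leaf position...
        int i = a.size() - 1;
        while (i > 0 && a.get((i - 1) / 2) > a.get(i)) {   // ...then bubble up
            swap(i, (i - 1) / 2);
            i = (i - 1) / 2;
        }
    }

    public int removeMin() {
        int min = a.get(0);
        a.set(0, a.get(a.size() - 1));  // move the last leaf to the root...
        a.remove(a.size() - 1);
        int i = 0;
        while (true) {                  // ...then bubble down toward the smaller child
            int left = 2 * i + 1, right = 2 * i + 2, smallest = i;
            if (left < a.size() && a.get(left) < a.get(smallest)) smallest = left;
            if (right < a.size() && a.get(right) < a.get(smallest)) smallest = right;
            if (smallest == i) break;   // heap order restored
            swap(i, smallest);
            i = smallest;
        }
        return min;
    }

    private void swap(int i, int j) {
        int t = a.get(i);
        a.set(i, a.get(j));
        a.set(j, t);
    }
}
```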

Week 7: 2/28-3/4

Topics: implementing ADTs


Mon Notes: AVL trees, 2-4 trees
homework #9
due Wed 3/2

Wed Reading: hashtables (skip the parts with code, and the sections titled "Interface for..." and "Implementation of...") and problems with hashtables

Thinking Problems:

  • How do you delete an element from a hashtable?
  • How likely are collisions? Do a calculation: assuming that each day is equally likely as someone's birthday, what is the probability that there are no shared birthdays in a class of 30 students?
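The birthday calculation is a short product; this sketch assumes 365 equally likely birthdays (names are illustrative):

```java
// Probability that n people all have distinct birthdays,
// assuming 365 equally likely days and no leap years.
public class Birthday {
    public static double noSharedBirthday(int n) {
        double p = 1.0;
        for (int i = 0; i < n; i++) {
            p *= (365.0 - i) / 365.0;   // person i must avoid the i days already taken
        }
        return p;
    }
}
```

For a class of 30, `noSharedBirthday(30)` comes out to roughly 0.29 - so a shared birthday (a "collision") is more likely than not.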

Notes: hashtables


Fri Notes: project 1 comments, and a little bit of the reality of hashtables
homework #10
due Wed 3/9

Week 8: 3/7-3/11

Topics: graphs and graph algorithms

Mon Notes: end of hashtables and recap of data structure implementation

Wed Reading: Edmonds section 14.1

Thinking Problem: exercise 14.1.1 (page 178) - for each search technique, start with node s and list the nodes in the order in which they are handled

Notes: graphs; Graph ADT; implementing the Graph ADT
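One possible implementation of a Graph ADT uses adjacency lists; the sketch below pairs that with a breadth-first search that reports the order in which nodes are handled, the kind of trace exercise 14.1.1 asks for (class and method names are assumptions for illustration, not from the course):

```java
import java.util.*;

// Adjacency-list graph of String-labeled nodes.
public class Graph {
    private final Map<String, List<String>> adj = new HashMap<>();

    public void addEdge(String u, String v) {   // undirected edge u-v
        adj.computeIfAbsent(u, k -> new ArrayList<>()).add(v);
        adj.computeIfAbsent(v, k -> new ArrayList<>()).add(u);
    }

    // Breadth-first search from s; returns nodes in the order they are handled.
    // Using a stack instead of a queue gives a depth-first-style order.
    public List<String> bfsOrder(String s) {
        List<String> order = new ArrayList<>();
        Set<String> found = new HashSet<>();
        Queue<String> queue = new ArrayDeque<>();
        queue.add(s);
        found.add(s);
        while (!queue.isEmpty()) {
            String u = queue.remove();
            order.add(u);
            for (String v : adj.getOrDefault(u, List.of())) {
                if (found.add(v)) queue.add(v);   // add() returns false if v was already found
            }
        }
        return order;
    }
}
```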


Fri Reading: Edmonds sections 14.2-14.4, 14.6 (if you only have time to read one section before class, read 14.3 first)

Thinking Problem: exercise 14.3.3, parts #4 and #5 only (page 188)

Notes: reachability (and another example of developing iterative algorithms + runtime)


Spring Break

Week 9: 3/21-3/25

Topics: graph algorithms; recursion


Mon Notes: unweighted shortest path
homework #11
due Wed 3/23

Wed Reading: Edmonds section 16.2.3

Notes: weighted shortest path (Dijkstra's algorithm), minimum spanning tree (Kruskal's algorithm)

homework #12
due Mon 3/28
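Dijkstra's algorithm from the Wednesday notes can be sketched compactly with a priority queue; the adjacency-map representation and all names below are assumptions for illustration (stale queue entries are skipped rather than updated in place):

```java
import java.util.*;

// Dijkstra's algorithm on a weighted graph with non-negative edge weights.
// graph.get(u) maps each neighbor v to the weight of the edge u-v.
// Returns the cost of the cheapest path from s to every reachable node.
public class Dijkstra {
    public static Map<String, Integer> shortestPaths(
            Map<String, Map<String, Integer>> graph, String s) {
        Map<String, Integer> dist = new HashMap<>();
        // entries pair a node with a tentative distance; a node may appear
        // several times, and all but the smallest entry are skipped as stale
        PriorityQueue<Map.Entry<String, Integer>> pq = new PriorityQueue<>(
                Comparator.comparingInt((Map.Entry<String, Integer> e) -> e.getValue()));
        pq.add(Map.entry(s, 0));
        while (!pq.isEmpty()) {
            Map.Entry<String, Integer> e = pq.remove();
            String u = e.getKey();
            int d = e.getValue();
            if (dist.containsKey(u)) continue;   // stale: u already finalized
            dist.put(u, d);                      // smallest tentative distance is final
            for (Map.Entry<String, Integer> edge :
                     graph.getOrDefault(u, Map.of()).entrySet()) {
                if (!dist.containsKey(edge.getKey())) {
                    pq.add(Map.entry(edge.getKey(), d + edge.getValue()));
                }
            }
        }
        return dist;
    }
}
```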

Fri Reading: Edmonds chapter 8

Thinking Problem: Construct a recursive algorithm for the task of finding the smallest element in an array. (Yes, I know there is a simple iterative algorithm. That's not the point here.) Write your algorithm in code or pseudocode, and demonstrate that you've satisfied the checklist for recursive algorithms given in section 8.5.
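One shape such an algorithm can take is below (an illustration, not the only correct answer - a solution that splits the array in half also works):

```java
// Recursive minimum of a nonempty array: the min of a[i..n-1] is the smaller
// of a[i] and the min of a[i+1..n-1]; base case is a one-element range.
public class RecursiveMin {
    public static int min(int[] a) {
        return minFrom(a, 0);                // assumes a.length >= 1
    }

    private static int minFrom(int[] a, int i) {
        if (i == a.length - 1) return a[i];  // base case: one element left
        int rest = minFrom(a, i + 1);        // trust the "friend" with a smaller problem
        return a[i] < rest ? a[i] : rest;    // combine: compare with our element
    }
}
```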

Notes: developing recursive algorithms


Week 10: 3/28-4/1

Topics: recursion and recurrence relations


Mon Reading: Edmonds section 27.1

Thinking Problem: Do either #1 or #2 and any one of #3-#7 (two problems total) from exercise 27.1.1 (pages 400-401) - without looking at the solution in the back of the book.

Notes: solving recurrence relations

homework #13
due Wed 3/30
project #3
Travel Agent
due Wed 4/13

Wed Reading: Edmonds chapter 9

Thinking Problem: In general, having fewer subproblems and reducing the amount of work needed to create and combine subproblems will make a recursive algorithm run faster. But what about specific advice? Imagine that you've developed a recursive algorithm with running time T(n) = a T(n/b) + f(n). Would it be more productive to think about how to reduce the number of subproblems (as was done in the exponentiation, multiplication, and matrix multiplication examples) or how to reduce the work done splitting/combining? Does your answer depend on the particular equation for T(n)? If so, how?

Notes: messy sums and recurrence relations that don't fit the table

homework #14
due Fri 4/1

Fri Notes: thoughts about the cheapest ticket and fewest flight legs problems, plus more about solving recurrence relations where the friends get problems of different sizes  

Week 11: 4/4-4/8

Topics: divide-and-conquer


Mon Notes: divide and conquer
homework #15
due Wed 4/6

Wed in-class exercises  

Fri Notes: divide and conquer (inversions, skyline, tournaments)  

Week 12: 4/11-4/15

Topics: recursive backtracking and branch-and-bound


Mon Reading: Edmonds chapter 17

Thinking Problem: exercise 17.5.1 (page 265), without looking at the solutions in the back of the book

Notes: recursive backtracking


Wed Notes: recursive backtracking, branch and bound
midterm #2
due Mon 4/18

(review information)

Fri Notes: branch and bound  

Week 13: 4/18-4/22

Topics: dynamic programming


Mon Reading: Edmonds sections 18.1-18.2

Thinking Problems:

  1. The book discusses turning a recursive backtracking algorithm into an iterative dynamic programming algorithm. Does a dynamic programming algorithm have to be iterative? That is, could you write it recursively without changing the big-Oh? Explain.
  2. Where does the polynomial running time come from?

Notes: a couple of notes on determining running times of algorithms, branch and bound wrapup, dynamic programming

homework #16
due Fri 4/22
project #4
due Tue 5/3

Wed Reading: Edmonds section 18.3

Thinking Problem: List two pieces of advice and/or strategies for developing dynamic programming algorithms.

Notes: dynamic programming (longest common subsequence)
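The longest common subsequence recurrence can be tabulated as below (a sketch with illustrative names; the table entry len[i][j] is the LCS length of the first i characters of s and the first j characters of t):

```java
// Dynamic programming for longest common subsequence length.
public class LCS {
    public static int length(String s, String t) {
        int[][] len = new int[s.length() + 1][t.length() + 1];  // row/col 0 stay 0 (empty prefix)
        for (int i = 1; i <= s.length(); i++) {
            for (int j = 1; j <= t.length(); j++) {
                if (s.charAt(i - 1) == t.charAt(j - 1)) {
                    len[i][j] = len[i - 1][j - 1] + 1;          // last characters match
                } else {
                    len[i][j] = Math.max(len[i - 1][j],         // drop last char of s...
                                         len[i][j - 1]);        // ...or last char of t
                }
            }
        }
        return len[s.length()][t.length()];
    }
}
```

Filling one table entry is constant work, so the running time is O(nm) for strings of lengths n and m - each subproblem is solved once and looked up thereafter.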

Fri Reading: Edmonds sections 19.1-19.4

Notes: dynamic programming (weighted event scheduling, matrix chain product)

homework #17
due Wed 4/27

Week 14: 4/25-4/29

Topics: dynamic programming; complexity

Mon Reading: Edmonds section 19.8

Thinking Problem: exercise 19.8.1 (just explain how to map the elephant problem to the graph problem, and how to turn a solution to the graph problem into a solution for the elephant problem - you do not need to repeat how to solve the graph problem)

Notes: solving problems; reductions

Wed Reading: Edmonds chapter 20 (read 20.2-20.3 just for an overview of the ideas - don't worry about the details)

Thinking Problem: exercise 20.1.1 (choose three of the parts)

Notes: reductions (event tour, elephants, billboards)

homework #18
due Sat 5/7 10pm
(Thu 5/5 5pm for feedback)

Fri Notes: dynamic programming (winning the championship, string edit distance) plus a tiny bit of complexity

Week 15: 5/2-5/3

Topics: complexity; wrapup

Mon Notes: complexity


Exams: 5/7-5/10

final exam
due Sat 5/7 10pm

(review information)

Sat end-of-semester deadline
no work accepted after 5/7 10pm
