This course ended on May 6, 2018.
CPSC 327: Data Structures and Algorithms
Department of Mathematics and Computer Science
Hobart and William Smith Colleges
Spring, 2018

Instructor: David J. Eck (eck@hws.edu)
Monday, Wednesday, Friday, 12:20–1:15 PM, Room Gulick 2001
Course Handout: http://math.hws.edu/eck/courses/cpsc327_s18.html
Homework

Homework 1: due January 24
Homework 2: due January 31
Homework 3: due February 7
Homework 4: due February 14
Homework 5: due February 21, postponed to February 23
Homework 6: due March 2
Homework 7: due March 30
Homework 8: due April 13; re-do due April 23
Fifteenth Week: April 30
The final exam is at 1:30 PM on Sunday, May 6. A study guide was handed out in class. We will spend Monday on questions about the exam.
My slightly revised office hours for this week are as follows (although I will actually be available at other times as well):
Wednesday, May 2: 2:00 to 4:00
Thursday, May 3: 1:00 to 2:30 and 3:00 to 3:50
Friday, May 4: 1:00 to 4:20
Saturday, May 5: 11:00 to 2:00
Sunday, May 6: 12:00 to 1:20
Fourteenth Week: April 23, 25, and 27
In the last full week of classes, we will finish our look at the P and NP classes of decision problems. I will attempt to convince you that CIRCUIT-SAT is NP-complete, and I hope to have time to do a couple of reductions to prove NP-completeness of other problems. We will look briefly at the possibility of approximation algorithms for NP-complete problems. By the end of the week, we should be doing some review for the final exam.
Thirteenth Week: April 16, 18, and 20
We are continuing with Chapter 9, covering the idea of problem reduction and NP-completeness. We will look at some specific reductions and at some specific "hard" problems, such as the Hamiltonian cycle, vertex cover, and clique problems for graphs, CIRCUIT-SAT, and SAT.
Twelfth Week: April 9, 11, and 13
After finishing up Dynamic Programming, we move on to Chapter 9 on Friday. We will discuss the meaning of "P" and "NP".
Eleventh Week: April 2, 4, and 6
We are moving on to Chapter 8, which covers "dynamic programming." Dynamic programming is a technique that can greatly reduce the run time of a recursive algorithm, in cases where subproblems are re-solved many times by the recursive algorithm. The reading is Chapter 8, Sections 1, 2, 3, and 5. We might not cover all the examples in these sections, and we will cover some examples that are not in the reading. Nevertheless, it won't hurt you to do the entire reading.
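As a preview of the basic idea, here is a small example in Java (my own illustration, not code from the textbook or from class) contrasting a plain recursive Fibonacci computation with a memoized, dynamic-programming version:

```java
/**
 * A small illustration of the dynamic programming idea: the plain recursive
 * fibSlow() re-solves the same subproblems over and over, while the memoized
 * version solves each subproblem only once.
 */
public class FibonacciDemo {

    // Plain recursion: exponential time, because fib(n-2), fib(n-3), ...
    // are recomputed many times.
    static long fibSlow(int n) {
        if (n <= 1)
            return n;
        return fibSlow(n - 1) + fibSlow(n - 2);
    }

    // Dynamic programming (memoization): remember each answer the first time
    // it is computed, so the total work is linear in n.
    static long fibFast(int n, long[] memo) {
        if (n <= 1)
            return n;
        if (memo[n] != 0)
            return memo[n];
        memo[n] = fibFast(n - 1, memo) + fibFast(n - 2, memo);
        return memo[n];
    }

    public static void main(String[] args) {
        int n = 45;
        System.out.println("memoized:        " + fibFast(n, new long[n + 1]));
        System.out.println("plain recursion: " + fibSlow(n)); // noticeably slower
    }
}
```

The examples we do in class will mostly build a table "bottom-up" instead of memoizing a recursion, but the saving is the same: each subproblem is solved only once.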
Tenth Week: March 26, 28, and 30
We will spend the week on the general topic of "state space search." The same general topic is covered in Chapter 7 of the textbook, but there is no assigned reading in the book for this week. We will discuss searching a tree of states, with "pruning" to cut down on the number of possibilities that need to be considered. One example of this is minimax with alpha/beta pruning, which is used for searching game trees. Later in the week, we will discuss the genetic algorithm and/or simulated annealing as an example of heuristic search.
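To give a concrete picture of pruning, here is a small Java sketch of minimax with alpha/beta pruning on an abstract game tree. This is an illustration only; the Node class and the tiny example tree are made up for this sketch and are not something we will use in class.

```java
import java.util.List;

/**
 * Minimax with alpha/beta pruning.  A node is either a leaf with a known
 * score or an internal node with a list of children.
 */
public class AlphaBetaDemo {

    static class Node {
        int score;            // used only if children == null (a leaf)
        List<Node> children;  // null for a leaf
        Node(int score) { this.score = score; }
        Node(List<Node> children) { this.children = children; }
    }

    /**
     * Returns the minimax value of node.  The interval (alpha, beta) is the
     * window of values that can still matter; once a subtree's value is known
     * to fall outside it, the remaining children are pruned (never examined).
     */
    static int alphaBeta(Node node, int alpha, int beta, boolean maximizing) {
        if (node.children == null)
            return node.score;
        if (maximizing) {
            int best = Integer.MIN_VALUE;
            for (Node child : node.children) {
                best = Math.max(best, alphaBeta(child, alpha, beta, false));
                alpha = Math.max(alpha, best);
                if (alpha >= beta)
                    break;   // beta cutoff: the minimizing player will avoid this node
            }
            return best;
        } else {
            int best = Integer.MAX_VALUE;
            for (Node child : node.children) {
                best = Math.min(best, alphaBeta(child, alpha, beta, true));
                beta = Math.min(beta, best);
                if (alpha >= beta)
                    break;   // alpha cutoff: the maximizing player will avoid this node
            }
            return best;
        }
    }

    public static void main(String[] args) {
        // A tiny two-level game tree; the maximizing player moves first.
        Node tree = new Node(List.of(
                new Node(List.of(new Node(3), new Node(5))),
                new Node(List.of(new Node(2), new Node(9)))));
        System.out.println(alphaBeta(tree, Integer.MIN_VALUE, Integer.MAX_VALUE, true));
        // Prints 3: the first min node gives 3, and once the second min node
        // sees the 2, the 9 leaf is pruned.
    }
}
```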
Ninth Week: March 12, 14, and 16
Next week is Spring Break. Have fun!
We will finish up Chapter 6 this week, covering algorithms for weighted graphs. We will cover Prim's algorithm and Kruskal's algorithm for minimal spanning trees. For Kruskal's algorithm, we will need the union/find data structure, which is interesting in its own right. We will also cover Dijkstra's algorithm and Floyd's algorithm for finding shortest paths. If there is time on Friday, we will talk briefly about network flow. But in any case, we will move on to Chapter 7 after break.
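For the curious, here is a sketch in Java of the union/find structure (my own illustration, not the textbook's code):

```java
/**
 * The union/find data structure used by Kruskal's algorithm.  Each element
 * 0..n-1 starts in its own set; find() returns a representative of an
 * element's set, and union() merges two sets.  With path compression and
 * union by rank, a sequence of operations takes very nearly linear time.
 */
public class UnionFind {

    private final int[] parent;  // parent[i] == i means i is a root
    private final int[] rank;    // upper bound on the height of i's tree

    public UnionFind(int n) {
        parent = new int[n];
        rank = new int[n];
        for (int i = 0; i < n; i++)
            parent[i] = i;
    }

    /** Find the representative of x's set, compressing the path as we go. */
    public int find(int x) {
        if (parent[x] != x)
            parent[x] = find(parent[x]);   // path compression
        return parent[x];
    }

    /** Merge the sets containing x and y; returns false if already merged. */
    public boolean union(int x, int y) {
        int rx = find(x), ry = find(y);
        if (rx == ry)
            return false;                  // already in the same set
        if (rank[rx] < rank[ry]) {         // attach the shorter tree under the taller
            parent[rx] = ry;
        } else if (rank[rx] > rank[ry]) {
            parent[ry] = rx;
        } else {
            parent[ry] = rx;
            rank[rx]++;
        }
        return true;
    }
}
```

Kruskal's algorithm then simply considers the edges in order of increasing weight and adds an edge to the spanning tree exactly when union() on its two endpoints returns true.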
Eighth Week: March 5, 7, and 9
There is a test in class on Wednesday. A study guide was handed out in class on Friday. The take-home part of the test will be available on Wednesday. There will be time in class on Monday to ask questions about the material that will be on the test. There will be time in class on Friday to ask questions about the take-home part. The take-home part of the test is due next Monday.
Aside from the test, we will be starting Chapter 6, which covers several important graph algorithms. We will start with algorithms for finding minimal spanning trees.
Seventh Week: February 26 and 28; March 2
This week, we will be covering basic graph traversal algorithms and their applications, such as finding connected components, topological sort of a directed graph, and checking whether a graph is bipartite. The reading is all of Chapter 5 except the "war stories" (5.3 and 5.4) and the section on articulation points (5.9.2).
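As one concrete example of a traversal application, here is a Java sketch that checks whether an undirected graph is bipartite by 2-coloring it with breadth-first search. This is an illustration only; the adjacency-list representation used here is just one reasonable choice, not necessarily the one we will use in class.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.List;
import java.util.Queue;

/**
 * Bipartiteness check by BFS 2-coloring.  The graph is given as adjacency
 * lists: adj.get(v) is the list of neighbors of vertex v.
 */
public class BipartiteCheck {

    static boolean isBipartite(List<List<Integer>> adj) {
        int n = adj.size();
        int[] color = new int[n];
        Arrays.fill(color, -1);                  // -1 means "not yet visited"
        for (int start = 0; start < n; start++) {
            if (color[start] != -1)
                continue;                        // already handled by an earlier BFS
            color[start] = 0;
            Queue<Integer> queue = new ArrayDeque<>();
            queue.add(start);
            while (!queue.isEmpty()) {
                int v = queue.remove();
                for (int w : adj.get(v)) {
                    if (color[w] == -1) {        // first visit to w: give it the other color
                        color[w] = 1 - color[v];
                        queue.add(w);
                    } else if (color[w] == color[v]) {
                        return false;            // an edge joins two same-colored vertices
                    }
                }
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // A 4-cycle 0-1-2-3-0 is bipartite.
        List<List<Integer>> square = List.of(
                List.of(1, 3), List.of(0, 2), List.of(1, 3), List.of(2, 0));
        System.out.println(isBipartite(square));   // true
    }
}
```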
There is a test coming up next week. The in-class part of the test will be next Wednesday, March 7. The take-home part of the test will be handed out on March 7 and will (probably) be due the following Monday.
Sixth Week: February 19, 21, and 23
We are still working on Chapter 4, spending some extra time on quicksort and its variations. We will move on to Chapter 5 on Friday at the latest. Chapter 5 covers graphs and basic graph-processing algorithms. (A graph here consists of a set of "vertices" and some "edges" that connect pairs of vertices.)
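For reference, here is a bare-bones quicksort in Java. It is just an illustration, not the exact version (or pivot rule) that we will settle on in class.

```java
import java.util.Arrays;

/**
 * A simple quicksort.  The partition step puts the pivot into its final
 * position, with smaller items to its left and larger items to its right;
 * then the two sides are sorted recursively.
 */
public class QuickSortDemo {

    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi)
            return;                       // zero or one element: already sorted
        int p = partition(a, lo, hi);
        quicksort(a, lo, p - 1);
        quicksort(a, p + 1, hi);
    }

    /** Partition a[lo..hi] around the last element (a simple pivot choice). */
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo;                       // a[lo..i-1] holds items < pivot
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;   // move the pivot into place
        return i;
    }

    public static void main(String[] args) {
        int[] a = { 9, 2, 7, 4, 4, 1, 8 };
        quicksort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 4, 4, 7, 8, 9]
    }
}
```

The "variations" mostly differ in how the pivot is chosen (a random element, median-of-three) and in how items equal to the pivot are handled.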
The due date for the homework this week has been postponed from Wednesday to Friday.
Fifth Week: February 12, 14, and 16
We will continue with Chapter 4. The reading is Sections 4.5, 4.6, 4.7, 4.9, and 4.10. We will discuss some specific sorting algorithms, including quicksort and mergesort. But we will also talk about "divide and conquer" as a general strategy for developing recursive algorithms. And we will discuss recurrence relations and the "Master Theorem" that gives a solution for recurrence relations of a certain form.
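To connect those two ideas, here is a mergesort sketch in Java, annotated with the recurrence it gives rise to (an illustration only, not the exact code from class):

```java
import java.util.Arrays;

/**
 * Mergesort as a divide-and-conquer example.  Its running time satisfies the
 * recurrence T(n) = 2T(n/2) + Theta(n): two half-size subproblems plus a
 * linear-time merge.  The Master Theorem then gives T(n) = Theta(n log n).
 */
public class MergeSortDemo {

    static void mergesort(int[] a, int lo, int hi) {
        if (hi - lo <= 1)
            return;                              // 0 or 1 items: already sorted
        int mid = (lo + hi) / 2;
        mergesort(a, lo, mid);                   // conquer the two halves...
        mergesort(a, mid, hi);
        merge(a, lo, mid, hi);                   // ...then combine them
    }

    /** Merge the sorted ranges a[lo..mid) and a[mid..hi) into one sorted range. */
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo];
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi)
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j < hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = { 5, 3, 8, 1, 9, 2 };
        mergesort(a, 0, a.length);
        System.out.println(Arrays.toString(a));  // [1, 2, 3, 5, 8, 9]
    }
}
```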
Fourth Week: February 5, 7, and 9
We will start the week talking about hash tables. We will cover both open hashing and closed hashing. We will then move on to the "heap" data structure and show how it is used to implement the priority queue abstract data type and how it is used in the heap sort algorithm. (Note: This is not the same heap that is used for dynamically allocated memory.) After that, there will be more to say about sorting, but that might not be until next week.
You should start reading Chapter 4, and you should definitely read sections 4.1 through 4.3. (Section 4.3 covers heaps and heapsort.)
If you want, you can also read my notes from 2004 about Hash Tables and about Priority Queues and HeapSort.
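If you would like a concrete picture of the heap before class, here is a rough sketch in Java of a binary min-heap (my own illustration; the code we write in class and the textbook's version may differ in details):

```java
import java.util.Arrays;

/**
 * A binary min-heap stored in an array: the children of the item at index i
 * sit at indices 2*i+1 and 2*i+2, and every item is <= its children.  This
 * gives O(log n) insert and remove-min, which is exactly what the priority
 * queue abstract data type needs.
 */
public class MinHeap {

    private int[] heap = new int[8];
    private int size = 0;

    /** Add an item, then "bubble it up" until the heap property is restored. */
    public void insert(int item) {
        if (size == heap.length)
            heap = Arrays.copyOf(heap, 2 * size);
        heap[size] = item;
        int i = size++;
        while (i > 0 && heap[(i - 1) / 2] > heap[i]) {
            int parent = (i - 1) / 2;
            int t = heap[i]; heap[i] = heap[parent]; heap[parent] = t;
            i = parent;
        }
    }

    /** Remove and return the smallest item (the root), then "sift down". */
    public int removeMin() {
        int min = heap[0];
        heap[0] = heap[--size];              // move the last item to the root
        int i = 0;
        while (true) {
            int left = 2 * i + 1, right = 2 * i + 2, smallest = i;
            if (left < size && heap[left] < heap[smallest]) smallest = left;
            if (right < size && heap[right] < heap[smallest]) smallest = right;
            if (smallest == i) break;
            int t = heap[i]; heap[i] = heap[smallest]; heap[smallest] = t;
            i = smallest;
        }
        return min;
    }

    public static void main(String[] args) {
        MinHeap h = new MinHeap();
        for (int x : new int[] { 7, 2, 9, 1, 5 })
            h.insert(x);
        for (int i = 0; i < 5; i++)
            System.out.print(h.removeMin() + " ");   // 1 2 5 7 9
    }
}
```

Heapsort is essentially what main() does: put all the items into a heap and remove them one at a time in sorted order (done cleverly, the whole thing can happen in place in the original array).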
Third Week: January 29 and 31; February 2
On Wednesday, we will move on to Chapter 3 in the textbook, which discusses some basic data structures. Most of these you already know (arrays, linked lists, stacks, queues, binary trees, and maybe dictionaries and hash tables). We will cover them rather quickly, but some of the details should be new to you. We will also cover priority queues, which might be entirely new to you.
Note that some of the data structures listed are actually abstract data types; that is, they are specified as a set of abstract operations on the data, without specifying how the operations are implemented. Stack, queue, dictionary, and priority queue are abstract data types. On the other hand, arrays, linked lists, binary trees, and hash tables are concrete data structures, defined by how the data is actually stored in memory.
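Here is a tiny Java illustration of that distinction (my own example, not something we will use in class): the interface describes the stack abstract data type purely by its operations, while the class is one possible concrete data structure that implements it.

```java
/** The stack ADT, specified only by its operations. */
interface StringStack {
    void push(String item);
    String pop();
    boolean isEmpty();
}

/** One concrete implementation: a linked list whose head is the top of the stack. */
class LinkedStringStack implements StringStack {

    private static class Node {
        String item;
        Node next;
        Node(String item, Node next) { this.item = item; this.next = next; }
    }

    private Node top = null;

    public void push(String item) {
        top = new Node(item, top);       // the new node becomes the new head
    }

    public String pop() {
        if (top == null)
            throw new IllegalStateException("pop from an empty stack");
        String item = top.item;
        top = top.next;
        return item;
    }

    public boolean isEmpty() {
        return top == null;
    }
}
```

An array-based class could implement the same interface, and code written against the StringStack operations would never know the difference.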
Second Week: January 22, 24, and 26
We are working on some of the mathematical background for "analysis of algorithms," in particular the Big-Oh, Big-Theta, and Big-Omega notations. The reading is Chapter 2, Sections 1 through 7. (If you are interested in reading my version of this material, see Chapter 1 from a set of notes that I wrote when I taught a Data Structures and Algorithms course in 2004.)
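As a quick reminder of what these notations assert: f(n) is O(g(n)) if there are constants c > 0 and n0 such that f(n) <= c*g(n) for every n >= n0. For example, 3n^2 + 5n <= 3n^2 + 5n^2 = 8n^2 for all n >= 1, so 3n^2 + 5n is O(n^2), with witnesses c = 8 and n0 = 1. Big-Omega is the same definition with the inequality reversed, and Big-Theta means that both hold.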
First Week: January 17 and 19
Welcome to the course!
The textbook for the course is: The Algorithm Design Manual, by Steven Skiena, 2nd Edition Paperback, ISBN: 978-1849967204.
The reading for the week is Chapter 1, Sections 1 to 4. I will not be assigning Sections 5 or 6. Section 6 is a "War Story", that is, an account of one of the author's practical experiences with algorithms. I will not require you to read the War Stories in the textbook, but they can be interesting and instructive, so you should consider taking a look at them even though they are not assigned.