Week 8
Monday, 08/11:
Wednesday, 08/13:
Week 7
Monday, 08/04:
Topics:
Introduction to Intractability
P, NP, NP-completeness, Reductions
Readings:
Slides
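As a small concrete illustration of the reduction idea (my own sketch, not from the slides): a set S is an independent set in a graph exactly when its complement V \ S is a vertex cover, so an algorithm for either problem solves the other.

```python
# Hedged sketch of a classic reduction: Independent Set <-> Vertex Cover.
# The edge-list graph format is an assumption made for illustration.

def is_independent_set(edges, S):
    """True iff no edge has both endpoints inside S."""
    return all(not (u in S and v in S) for u, v in edges)

def is_vertex_cover(edges, C):
    """True iff every edge has at least one endpoint inside C."""
    return all(u in C or v in C for u, v in edges)

def complement(vertices, S):
    """The reduction itself: complementing swaps the two properties."""
    return set(vertices) - set(S)
```

For example, on the path 0-1-2-3, the set {0, 2} is independent and its complement {1, 3} covers every edge.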
Wednesday, 08/06:
Week 6
Monday, 07/28:
Wednesday, 07/30:
Topics:
Bellman-Ford Algorithm
Floyd-Warshall Algorithm
Readings:
Slides
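A minimal sketch of Bellman-Ford (assuming an edge-list graph representation; this is not the lecture code), including the extra pass that detects a reachable negative cycle:

```python
# Hedged sketch of the Bellman-Ford single-source shortest-path algorithm.
# Edges are (u, v, w) triples; negative weights are allowed.

def bellman_ford(n, edges, source):
    INF = float('inf')
    dist = [INF] * n
    dist[source] = 0
    # Relax every edge n-1 times; after pass i, every shortest path
    # that uses at most i edges has been found.
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One further pass: any improvement now means a negative cycle
    # is reachable from the source.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            raise ValueError("negative cycle detected")
    return dist
```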
Week 5
Monday, 07/21:
Topics:
Karger's Min Cut
Approximate Max Cut
Readings:
Slides
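Karger's algorithm can be sketched as repeated random edge contraction; the union-find bookkeeping and the input format below are my own assumptions, not the course code:

```python
import random

# Hedged sketch of Karger's randomized min-cut: contract random edges
# until two super-vertices remain, repeat many trials, keep the best cut.

def karger_min_cut(n, edges, trials=100, seed=0):
    rng = random.Random(seed)
    best = len(edges)
    for _ in range(trials):
        # Union-find tracks which super-vertex each original vertex joined.
        parent = list(range(n))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        remaining = n
        while remaining > 2:
            u, v = edges[rng.randrange(len(edges))]
            ru, rv = find(u), find(v)
            if ru != rv:          # contract an edge between two
                parent[ru] = rv   # distinct super-vertices
                remaining -= 1
        # Cut size = original edges crossing the two super-vertices.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

A single trial finds a min cut only with probability about 2/n^2, which is why the algorithm is run many independent times.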
Wednesday, 07/23:
Week 4
Monday, 07/14:
Topics:
Readings:
Slides
Note:
On Wednesday we will go over the optimality proof of Huffman codes before
starting with Randomized Algorithms. For those of you interested in the history
of algorithms, the invention of Huffman codes has a very interesting story.
Wikipedia gives a nice
narration: “In 1951, David A. Huffman and his MIT
information theory
classmates were given the choice of a term paper or a final exam. The
professor, Robert M. Fano,
assigned a term paper on the problem of finding the most efficient binary code.
Huffman, unable to prove any codes were the most efficient, was about to give up
and start studying for the final when he hit upon the idea of using a frequency
sorted binary tree and quickly proved this method the most efficient. In
doing so, the student outdid his professor, who had worked with information
theory inventor Claude Shannon to
develop a similar code. By building the tree from the bottom up instead of the
top down, Huffman avoided the major flaw of the suboptimal
Shannon-Fano coding.”
Here's a very nice 1991 Scientific American article about Huffman. The following short excerpt
from the article teaches a very important lesson from this history:
“Huffman says he might never have tried his hand at the problem—much less
solved it at the age of 25—if he had known that Fano, his professor, and
Claude E. Shannon, the creator of information theory, had struggled with it.
'It was my luck to be there at the right time and also not have my professor
discourage me by telling me that other good people had struggled with this
problem,’ he says.”
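The bottom-up, frequency-sorted construction the excerpt describes can be sketched in a few lines (a hedged illustration of the idea, not course code), with a heap standing in for the sorted pool of subtree weights:

```python
import heapq

# Hedged sketch of Huffman's bottom-up construction: repeatedly merge
# the two least frequent subtrees. Returns the total weighted code
# length (the cost of an optimal prefix code).

def huffman_cost(freqs):
    heap = list(freqs)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        cost += a + b          # each merge adds one bit to every symbol below it
        heapq.heappush(heap, a + b)
    return cost
```

For frequencies [1, 1, 2, 4] the optimal tree assigns code lengths 3, 3, 2, 1, so the cost is 1·3 + 1·3 + 2·2 + 4·1 = 14.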
Wednesday, 07/16:
Week 3
Monday, 07/07:
Topics:
Readings:
Slides
Notes:
Broder et al.'s paper on the structure of the web
Milgram's paper on the 'small world' phenomenon.
The paper is not only famous, but its abstract is
written in a style very different from standard academic prose:
“Fred Jones of Peoria, sitting in a sidewalk cafe in Tunis, and needing
a light for his cigarette, asks the man at the next table for a match.
They fall into conversation; the stranger is an Englishman who, it turns
out, spent several months in Detroit. 'I know it's a foolish question,’
says Jones, 'but did you ever by any chance run into a fellow named Ben Arkadian?
He's an old friend of mine, manages a chain of supermarkets in Detroit…’
'Arkadian, Arkadian,’ the Englishman mutters. 'Why, upon my soul, I
believe I do! Small chap, very energetic, raised merry hell with the
factory over a shipment of defective bottlecaps.’
'No kidding!’ Jones exclaims in amazement.
'Good lord, it's a small world, isn't it?’”
Wednesday, 07/09:
Week 2
Monday, 06/30:
Topics:
Review of Heaps and Priority Queues
Graph Primitives
Dijkstra's Single-Source Shortest Paths Algorithm
Connected Components in Undirected Graphs
Readings:
Slides
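A minimal heap-based sketch of Dijkstra's single-source shortest paths (the adjacency-list format is an assumption; this is not the lecture code):

```python
import heapq

# Hedged sketch of Dijkstra's algorithm with a binary heap.
# adj maps each vertex u to a list of (v, w) pairs with w >= 0.

def dijkstra(adj, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue           # stale heap entry; u was already settled
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

With a binary heap this runs in O((|V| + |E|) log |V|); correctness relies on all edge weights being non-negative.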
Wednesday, 07/02:
Topics:
Readings:
KT: 3.5, 3.6
Handout 7: 3.4
Slides
Week 1
Monday, 06/23:
Topics:
What's an algorithm and what do we mean by analysis of algorithms?
Sorting
Naive SelectionSort algorithm
Introduction to Divide and Conquer algorithms
MergeSort algorithm
Review of Big-Oh notation
Readings:
Slides
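MergeSort is the canonical divide-and-conquer example; here is a hedged sketch (not the lecture code) showing the split/recurse/merge structure whose recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n):

```python
# Hedged sketch of MergeSort: sort each half recursively, then merge
# the two sorted halves in linear time.

def merge_sort(a):
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    # Merge step: repeatedly take the smaller of the two head elements.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```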
Wednesday, 06/25:
Topics:
Substitution Method
Matrix Multiplication
Master Theorem
Readings:
Note:
For those of you who are interested in the history of algorithms, here is
Virginia Williams' paper,
which gives the (current) fastest known matrix multiplication algorithm. The
first page describes the history of the problem very nicely. Her algorithm
is a modification of the Coppersmith-Winograd algorithm from 1990.
History of Matrix Multiplication:
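For reference, here is the classical Θ(n³) algorithm that this whole line of work improves on (a hedged illustration of my own, not from the slides). Strassen's divide-and-conquer recurrence T(n) = 7T(n/2) + O(n²) already beats it: by the Master Theorem it gives O(n^log₂7) ≈ O(n^2.81).

```python
# Hedged sketch of classical matrix multiplication: three nested loops,
# Theta(n * m * p) scalar multiplications (Theta(n^3) for square matrices).
# Matrices are lists of row lists.

def mat_mul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            aik = A[i][k]          # hoisted: reused across the inner loop
            for j in range(p):
                C[i][j] += aik * B[k][j]
    return C
```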
