CS 161: Schedule

Semih Salihoglu, Stanford University, Summer Quarter 2014

Week 6

Monday, 07/28:

  • Topics:

    • Intro to Dynamic Programming

      • Linear Independent Set

      • Sequence Alignment

  • Readings:

    • KT: 6.1 - 6.4, 6.6

  • Slides

Wednesday, 07/30:

  • Topics:

    • Bellman Ford Algorithm

    • Floyd-Warshall Algorithm

  • Readings:

    • KT: 6.8, 6.10, 6.9 (Optional)

  • Slides

Week 5

Monday, 07/21:

  • Topics:

    • Karger's Min Cut

    • Approximate Max Cut

  • Readings:

    • KT: 13.2

  • Slides

Wednesday, 07/23:

  • Topics:

    • Hash Tables & Universal Hashing

  • Readings:

    • KT: 13.6

  • Slides

Week 4

Monday, 07/14:

  • Topics:

    • Kruskal's Running Time

    • Huffman Codes and Data Compression

  • Readings:

    • KT: 4.6, 4.8

  • Slides

  • Note:

    • On Wednesday we will go over the optimality proof of Huffman codes before starting with Randomized Algorithms. For those of you interested in the history of algorithms, the invention of Huffman codes has a very interesting story. Wikipedia gives a nice narration:
      “In 1951, David A. Huffman and his MIT information theory classmates were given the choice of a term paper or a final exam. The professor, Robert M. Fano, assigned a term paper on the problem of finding the most efficient binary code. Huffman, unable to prove any codes were the most efficient, was about to give up and start studying for the final when he hit upon the idea of using a frequency-sorted binary tree and quickly proved this method the most efficient. In doing so, the student outdid his professor, who had worked with information theory inventor Claude Shannon to develop a similar code. By building the tree from the bottom up instead of the top down, Huffman avoided the major flaw of the suboptimal Shannon-Fano coding.”
      Here's a very nice 1991 Scientific American article about Huffman. The following short excerpt captures an important lesson from this history: “Huffman says he might never have tried his hand at the problem—much less solved it at the age of 25—if he had known that Fano, his professor, and Claude E. Shannon, the creator of information theory, had struggled with it. ‘It was my luck to be there at the right time and also not have my professor discourage me by telling me that other good people had struggled with this problem,’ he says.”
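      The bottom-up construction the quote describes can be sketched in a few lines. This is an illustrative sketch, not course code, and the function name `huffman_codes` is chosen here for the example; it assumes a nonempty input string. The greedy step is exactly the one Huffman found: repeatedly merge the two least frequent subtrees.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free binary code bottom-up: repeatedly merge
    the two least frequent subtrees into one (frequencies add)."""
    freq = Counter(text)
    # Each heap entry: (frequency, unique tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate one-symbol input
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # least frequent subtree
        f2, _, right = heapq.heappop(heap)  # second least frequent
        # Merging prepends one bit: 0 for the left subtree, 1 for the right.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]
```

      On input "aaaabbc" this assigns the most frequent symbol 'a' a 1-bit code and the rarer 'b' and 'c' 2-bit codes, and the resulting code is prefix-free. We will prove on Wednesday that this greedy choice is optimal.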

Wednesday, 07/16:

  • Topics:

    • Introduction to Randomized Algorithms

    • Quicksort, Quickselect

  • Readings:

    • KT: 13.1, 13.3 (required probability review), 13.5

  • Slides

Week 3

Monday, 07/07:

  • Topics:

    • Introduction to Greedy Algorithms

    • Interval Scheduling

  • Readings:

    • KT: 4.1 - 4.2

  • Slides

  • Notes:

    • Broder et al.'s paper on the structure of the web

    • Milgram's paper on the ‘small world’ phenomenon.
      The paper is not only famous; its abstract is also written in a style very different from standard academic prose:
      “Fred Jones of Peoria, sitting in a sidewalk cafe in Tunis, and needing a light for his cigarette, asks the man at the next table for a match. They fall into conversation; the stranger is an Englishman who, it turns out, spent several months in Detroit. 'I know it's a foolish question,’ says Jones, 'but did you ever by any chance run into a fellow named Ben Arkadian? He's an old friend of mine, manages a chain of supermarkets in Detroit…’
      'Arkadian, Arkadian,’ the Englishman mutters. 'Why, upon my soul, I believe I do! Small chap, very energetic, raised merry hell with the factory over a shipment of defective bottlecaps.’
      'No kidding!’ Jones exclaims in amazement.
      'Good lord, it's a small world, isn't it?’”

Wednesday, 07/09:

  • Topics:

    • Minimum Spanning Trees

  • Readings:

    • KT: 4.5 - 4.6

    • Handout 8

  • Slides

Week 2

Monday, 06/30:

  • Topics:

    • Review of Heaps and Priority Queues

    • Graph Primitives

    • Dijkstra's Single-Source Shortest Paths Algorithm

    • Connected Components in Undirected Graphs

  • Readings:

    • KT: 2.5, 3.1 - 3.3, 4.4

  • Slides

Wednesday, 07/02:

  • Topics:

    • Topological Sorting of a Directed Acyclic Graph

    • Kosaraju's Strongly Connected Components Algorithm in Directed Graphs and its Applications

  • Readings:

    • KT: 3.5, 3.6

    • Handout 7: 3.4

  • Slides

Week 1

Monday, 06/23:

  • Topics:

    • What's an algorithm and what do we mean by analysis of algorithms?

    • Sorting

    • Naive SelectionSort algorithm

    • Introduction to Divide and Conquer algorithms

    • MergeSort algorithm

    • Review of Big-Oh notation

  • Readings:

    • KT: 2.1, 2.2, 2.4, 5.1 (you can skip over the subsections on the substitution method)

  • Slides

Wednesday, 06/25:

  • Topics:

    • Substitution Method

    • Matrix Multiplication

    • Master Theorem

  • Readings:

    • KT: 2.1 (again), 5.1's subsections on substitution method

    • Handout 5: 2.2, 2.5

    • Handout 6

  • Note:
    For those of you who are interested in the history of algorithms, here is Virginia Williams’ paper, which gives the (currently) fastest known matrix multiplication algorithm. The first page describes the history of the problem very nicely. Her algorithm is a modification of the Coppersmith-Winograd algorithm from 1990.
    History of Matrix Multiplication:

    • Before 1969: O(n^3)

    • 1969: Strassen to O(n^{2.808})

    • 1978: Pan to O(n^{2.796})

    • 1979: Bini to O(n^{2.780})

    • 1981: Schönhage to O(n^{2.522})

    • 1981: Coppersmith and Winograd to O(n^{2.496})

    • 1986: Strassen to O(n^{2.479})

    • 1990: Coppersmith and Winograd to O(n^{2.376})

    • 2012: Williams to O(n^{2.3729})
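    To see where the very first improvement in this table comes from, here is an illustrative sketch (not course code) of Strassen's 1969 recursion: it computes the product of two n×n matrices using 7 half-size products instead of the naive 8, so T(n) = 7T(n/2) + O(n^2), which solves to O(n^{log2 7}) ≈ O(n^{2.808}). The sketch assumes square matrices whose dimension is a power of two.

```python
def strassen(A, B):
    """Multiply square matrices (lists of lists) whose size is a
    power of two, using Strassen's 7-product recursion."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def quad(M, r, c):          # extract an h-by-h quadrant
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    # The 7 recursive products (the naive algorithm would need 8).
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    # Combine into the four quadrants of the product.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```

    The later improvements in the table come from very different (and far more involved) techniques than this recursion, which is part of what makes the history interesting.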