CS 168: The Modern Algorithmic Toolbox
- 4/15: Mini-project #3 is available. It is due Tuesday, April 23rd at 11:59pm.
- 4/9: Mini-project #2 is available.
It is due Tuesday, April 16th at 11:59pm. The dataset is available here.
- Here is some starter code for drawing
heatmaps in Python (and/or check Stack Overflow).
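- If you want a self-contained snippet to start from, here is a minimal heatmap sketch using matplotlib (an illustrative stand-in, not the official starter code; the data and filename are made up):

    import numpy as np
    import matplotlib.pyplot as plt

    data = np.random.rand(10, 10)                 # made-up 10x10 matrix of values
    plt.imshow(data, cmap='hot', interpolation='nearest')
    plt.colorbar()                                # legend for the cell values
    plt.title('example heatmap')
    plt.savefig('heatmap.png')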
- 4/9: Here
is a review (courtesy of EE263) of the most basic aspects of vectors and matrices (e.g., how
to multiply them). Most relevant for CS168 are pages 1--9 (up to "Block
matrices and submatrices") and the "Linear functions" and "Linear
equations" sections on pages 10--11.
- 4/8: Mini-project #2 will be posted in the morning.
- 4/4: TAs' office hours are posted below. Melody will hold the first office hour Friday, 3:45-5:45pm, in Gates 119.
- 4/1: Mini-project #1 is available.
It is due Tuesday, April 9th (at 11:59pm). Please submit via Gradescope, entry code 9NYKNZ.
- Here is some starter code for drawing
histograms in Python (and/or check Stack Overflow).
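- Likewise, a minimal histogram sketch using matplotlib (illustrative only, not the official starter code; the data is made up):

    import numpy as np
    import matplotlib.pyplot as plt

    samples = np.random.randn(10000)              # made-up data: 10,000 Gaussian samples
    plt.hist(samples, bins=50)
    plt.xlabel('value')
    plt.ylabel('count')
    plt.savefig('histogram.png')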
- 4/1: First class, 1:30-2:50pm in 420-040. Welcome to CS168!
Gregory Valiant (Office hours: Mon 3-4pm, Gates 470. Email: last name at stanford.edu).
(Office hours: Thurs 3-5pm, Gates 506. Email: jbarratt at stanford.edu).
(Office hours: Mon 4-6pm, Gates 498.)
(Office hours: Fri 3:45-5:45pm, Gates 159 (except for 5/10, in 104). Email: mguan at stanford.edu).
(Office hours: Wed 5:30-7:30pm, Gates 159.)
Time/location: 1:30 - 2:50pm Mon/Wed in 420-040.
Piazza site: Here.
Prerequisites: CS107 and CS161, or permission from the instructor.
This course will provide a rigorous and hands-on
introduction to the central ideas and algorithms that constitute the
core of the modern algorithms toolkit. Emphasis will be on
understanding the high-level theoretical intuitions and principles
underlying the algorithms we discuss, as well as developing a concrete
understanding of when and how to implement and apply the algorithms.
The course will be structured as a sequence of one-week
investigations; each week will introduce one algorithmic idea, and
discuss the motivation, theoretical underpinning, and practical
applications of that algorithmic idea. Each topic will be accompanied
by a mini-project in which students will be guided through a practical
application of the ideas of the week. Topics include modern techniques
in hashing, dimension reduction, linear and convex programming,
gradient descent and regression, sampling and estimation, compressive
sensing, and linear-algebraic techniques (principal components
analysis, singular value decomposition, spectral techniques).
Week 1: Modern Hashing
- Lecture 1 (Mon 4/1): Course introduction. Consistent hashing (see the first sketch below).
- Lecture 2 (Wed 4/3):
Property-preserving lossy compression.
From majority elements to approximate heavy hitters.
From Bloom filters to the count-min sketch (see the second sketch below).
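- A minimal sketch of consistent hashing (illustrative only, not code from lecture; the server names and hash choice are arbitrary). Each server and each key is hashed onto a ring, and a key is assigned to the first server clockwise from it:

    import bisect, hashlib

    def ring_hash(key):
        # Map a string to a point on the ring [0, 2**32).
        return int(hashlib.md5(key.encode()).hexdigest(), 16) % 2**32

    servers = ['cache-a', 'cache-b', 'cache-c']        # hypothetical server names
    ring = sorted((ring_hash(s), s) for s in servers)  # server positions on the ring

    def assign(key):
        # Walk clockwise from the key's position to the first server at or after it.
        i = bisect.bisect(ring, (ring_hash(key), ''))
        return ring[i % len(ring)][1]

    print(assign('user-42'))

  Adding or removing one of n servers reassigns only about a 1/n fraction of the keys, which is the whole point.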
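- A minimal count-min sketch (again illustrative; the width, depth, and salted use of Python's built-in hash are arbitrary choices):

    import numpy as np

    class CountMinSketch:
        def __init__(self, width=1000, depth=5, seed=0):
            rng = np.random.default_rng(seed)
            self.width = width
            self.salts = [int(s) for s in rng.integers(1, 2**31, size=depth)]
            self.table = np.zeros((depth, width), dtype=np.int64)

        def _cols(self, item):
            # One hash function per row, simulated by salting Python's hash.
            return [hash((salt, item)) % self.width for salt in self.salts]

        def add(self, item):
            for row, col in enumerate(self._cols(item)):
                self.table[row, col] += 1

        def count(self, item):
            # Each row can only overcount (collisions add), so take the minimum.
            return int(min(self.table[row, col] for row, col in enumerate(self._cols(item))))

    cms = CountMinSketch()
    for word in ['a', 'b', 'a', 'c', 'a']:
        cms.add(word)
    print(cms.count('a'))   # 3: possibly an overestimate, never an underestimate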
Week 2: Data with Distances (Similarity Search, Nearest Neighbor,
Dimension Reduction, LSH)
- Lecture 3 (Mon 4/8):
(Dis)similarity metrics: Jaccard, Euclidean, Lp.
An efficient algorithm for finding similar elements in small/medium (i.e., <20)
dimensions using k-d trees (see the example after this week's readings).
- Lecture 4 (Wed 4/10):
Curse of Dimensionality, kissing number.
Estimating Jaccard similarity using MinHash (see the sketch after this week's readings).
JL dimensionality reduction.
- A nice survey of "kissing number", and some other strange phenomena from high dimensional spaces:
Kissing Numbers, Sphere Packings, and some Unexpected Proofs (from 2000).
- Origins of MinHash at Alta Vista: Identifying and Filtering Near-Duplicate Documents (from 2000).
- Ailon/Chazelle, Faster Dimension Reduction, CACM '10.
- Andoni/Indyk, Near-Optimal Hashing Algorithms for Approximate Nearest Neighbor in High Dimensions, CACM '08.
- For much more on LSH, see this chapter of
the CS246 textbook (by Leskovec, Rajaraman, and Ullman).
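- A minimal k-d tree example using scipy (assuming scipy is installed; the point set is made up):

    import numpy as np
    from scipy.spatial import cKDTree

    points = np.random.rand(10000, 3)                # 10,000 random points in 3 dimensions
    tree = cKDTree(points)                           # build once, then query many times
    dists, idxs = tree.query([0.5, 0.5, 0.5], k=5)   # 5 nearest neighbors of a query point
    print(idxs, dists)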
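- A minimal MinHash sketch (illustrative; the salted use of Python's built-in hash stands in for random hash functions):

    import numpy as np

    def minhash_signature(s, num_hashes=200, seed=0):
        # One (simulated) random hash per signature entry; keep the min over the set.
        rng = np.random.default_rng(seed)
        salts = [int(x) for x in rng.integers(1, 2**31, size=num_hashes)]
        return [min(hash((salt, item)) for item in s) for salt in salts]

    def estimated_jaccard(sig_a, sig_b):
        # The min-hashes of two sets agree with probability exactly
        # |A intersect B| / |A union B|, so the fraction of agreeing
        # signature entries estimates the Jaccard similarity.
        return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

    A, B = set('abcdefgh'), set('defghijk')          # true Jaccard similarity: 5/11
    print(estimated_jaccard(minhash_signature(A), minhash_signature(B)))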
Week 3: Generalization and Regularization
- Lecture 5 (Mon 4/15):
Generalization (or, how much data is enough?).
Learning an unknown function from samples drawn from an unknown distribution.
Training error vs. test error. PAC guarantees for linear classifiers. Empirical risk minimization.
- Lecture 6 (Wed 4/17): Regularization. The polynomial embedding and random projection, L2 regularization, and L1 regularization as a computationally tractable surrogate for L0 regularization (see the ridge-regression sketch below).
- A recent paper arguing that, to understand why deep learning works, we need to rethink the theory of generalization. This paper is quite controversial, with one camp thinking that its conclusions are completely obvious, and the other camp thinking that it is revealing an extremely deep mystery. You decide for yourself! Paper is here.
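- A minimal L2-regularization (ridge regression) sketch on made-up data: minimize ||Xw - y||^2 + lambda*||w||^2, which has the closed form w = (X^T X + lambda*I)^(-1) X^T y. With more features than samples, X^T X alone is singular, so the lambda*I term is what makes the problem well-posed:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 200))           # 50 samples, 200 features
    y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(50)

    lam = 1.0                                    # regularization strength
    w_hat = np.linalg.solve(X.T @ X + lam * np.eye(200), X.T @ y)
    print(w_hat[:5])                             # weights on the 5 informative features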
Week 4: Linear-Algebraic Techniques:
Understanding Principal Components Analysis
- Lecture 7 (Mon 4/22):
Understanding Principal Component Analysis (PCA).
Minimizing squared distances equals maximizing variance.
Use cases for data visualization and data compression.
Failure modes for PCA.
- A nice exposition by 23andMe of the fact that the top 2 principal components of genetic SNP data of Europeans essentially recover the geography of Europe: nice exposition with figures. Original Nature paper: Genes mirror geography within Europe, Nature, Aug. 2008 (see also this blog post).
- There are tons of PCA tutorials floating around the Web (some good, some
not so good), which you are also permitted to refer to.
- Lecture 8 (Wed 4/24):
How PCA works. Maximizing variance as finding the
"direction of maximum stretch" of the covariance matrix.
The simple geometry of "diagonals in disguise."
The power iteration algorithm (see the sketch below).
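- A minimal power iteration sketch for computing the top principal component (illustrative; the fixed iteration count and seed are arbitrary choices):

    import numpy as np

    def top_principal_component(X, num_iters=100):
        # Center the data, then run power iteration on the covariance matrix:
        # each multiplication stretches v toward the "direction of maximum stretch".
        X = X - X.mean(axis=0)
        cov = X.T @ X / len(X)
        v = np.random.default_rng(0).standard_normal(X.shape[1])
        for _ in range(num_iters):
            v = cov @ v
            v /= np.linalg.norm(v)               # re-normalize to keep v a unit vector
        return v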
Week 5: Linear-Algebraic Techniques: Understanding the Singular Value Decomposition
Week 6: Spectral Graph Theory
Week 7: Sampling and Estimation
Week 8: The Fourier Perspective (and other bases)
Week 9: Sparse Vector/Matrix Recovery (Compressive Sensing)
Bonus Lecture: Privacy-Preserving Computation
- Assignments (75%): There will be 9 weekly mini-projects centered around the topics covered that week. Each mini-project contains both written and programming parts. The projects can be done individually or in pairs. If you work in a pair, only one member should submit all of the relevant files.
For the written part, you are encouraged to use LaTeX to typeset your homeworks;
we've provided a template for your
convenience. We will be using the Gradescope online submission system. You should have received an email saying that you've been enrolled in CS168 on Gradescope. If not, create an account on Gradescope using your Stanford ID and join CS168 using entry code 9NYKNZ.
For the programming part, you are encouraged to use MATLAB (tutorial), NumPy and Pyplot in Python (Python tutorial, NumPy tutorial, Pyplot tutorial), or some other scientific computing tool (with plotting). Here is a comprehensive Python tutorial using IPython Notebook. IPython Notebook is an interactive computational environment, especially useful for scientific computing (tutorial on how to set up). For easy reference, you can also view the notebook here.
Assignments are released on Mondays, and are due at 11:59pm on Tuesdays the following week (both the written and the programming parts). No late assignments will be accepted, but we will drop your lowest assignment grade when calculating your final grade.
- Exam (25%):
Date: Monday, June 10th, 3:30 - 6:30 pm.
Except where otherwise noted, you may refer only to your course notes and to the
textbooks and research papers listed on the course Web page. You cannot refer
to textbooks, handouts, or research papers that are not listed on the course
home page. If you do use any approved sources, make sure you cite them
appropriately, and make sure that all your words are your own.
You are also permitted to use general resources for whatever programming
language you choose to use.
You can discuss the problems verbally at a high level with other groups. And of course, you are encouraged to contact the course staff (via Piazza or office hours) for additional help.
Please follow the honor code.