STATS214 / CS229M: Machine Learning Theory
Stanford / Winter 2020-2021
- Lectures: Mon/Wed 4-5:20pm
- Chen Cheng
- Sifan Liu
- Jingyi Kenneth Tay
- Kangjie Zhou
Please use Piazza for all questions and discussions.
When do machine learning algorithms work and why? How do we formalize what it means for an algorithm to learn from data? How do we use mathematical thinking to design better machine learning methods?
This course focuses on developing a theoretical
understanding of the statistical properties of learning algorithms.
- Generalization bounds (Rademacher complexity)
- Implicit/algorithmic regularization
- Online learning
- Bandit problems
- Domain shift
Please see this doc for all the logistics information and some frequently asked questions.
You can find all TeX and PDF files for the scribed notes in this GitHub repo. You can also click on the lectures in the schedule below to view each individual scribe note.
- Mon 03/08:
- Wed 03/10:
- Wed 03/10: Homework 3 due
- Mon 03/15:
- Wed 03/17:
- Fri 03/19: Paper review due
There is no required text for the course. A number of useful references: