Statistics 311/Electrical Engineering 377: Information Theory and Statistics

John Duchi, Stanford University, Fall 2021

Lectures

Tuesday and Thursday, 1:30pm - 3:00pm. Building 200, room 205 (History Corner room 205).

Contact and communication with staff

  • Staff email list: stats311-aut2122-staff [at] lists.stanford.edu

  • We will use Ed for discussion this quarter.

  • We will use Gradescope for grading and problem set submissions. We have posted the Gradescope entry code on Ed.

Instructor

John Duchi

  • Office hours: Tuesdays and Thursdays, 3:00pm - 4:00pm, 126 Sequoia Hall.

Teaching Assistants

Hilal Asi

  • Office hours: Wednesdays, 4:30pm - 6:00pm, 210 Packard. Note: during the week of October 11, office hours will instead be held Wednesday at 9:00am in 210 Packard.

Prerequisites

Mathematical maturity and any convex combination of Stats 300A, Stats 310A, CS229, and EE276a.

Description

Information theory was developed to solve fundamental problems in the theory of communications, but its connections to statistical estimation and inference date nearly to the birth of the field. With their focus on fundamental limits, information-theoretic techniques have provided deep insights into optimal procedures for a variety of inferential tasks. In addition, the basic quantities of information theory (entropy, relative entropy, and their generalizations to other divergence measures, such as f-divergences) are central in many areas of mathematical statistics and probability.
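
For concreteness, these quantities have the following standard definitions. For a discrete random variable $X$ with probability mass function $p$, the entropy is
\[
H(X) = -\sum_x p(x) \log p(x),
\]
and for distributions $P$ and $Q$, with $P$ absolutely continuous with respect to $Q$, the relative entropy (KL divergence) is
\[
D_{\rm kl}(P \| Q) = \int \log \frac{dP}{dQ} \, dP.
\]
An $f$-divergence generalizes the latter: for a convex function $f$ with $f(1) = 0$,
\[
D_f(P \| Q) = \int f\left(\frac{dP}{dQ}\right) dQ,
\]
which recovers the KL divergence with $f(t) = t \log t$ and the total variation distance with $f(t) = \frac{1}{2}|t - 1|$.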

The applications of these tools are numerous. In mathematical statistics, for example, they allow characterization of optimal error probabilities in hypothesis testing, determination of minimax rates of convergence for estimation problems, and demonstration of equivalence between (ostensibly) different estimation problems, and they motivate penalized estimators and the minimum description length principle. In probability, they provide insights into the central limit theorem and large deviations theory (via Sanov's theorem and other results), and they appear in empirical process theory and concentration of measure. Information-theoretic techniques also arise in game playing, gambling, and stochastic optimization and approximation, among other areas.
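
As one concrete instance of such a characterization, the (standard) Chernoff-Stein lemma considers testing $H_0 : X_i \stackrel{\rm iid}{\sim} P$ against $H_1 : X_i \stackrel{\rm iid}{\sim} Q$ from a sample of size $n$. Among tests whose type-I error is at most $\epsilon \in (0, 1)$, the smallest achievable type-II error $\beta_n$ satisfies
\[
\lim_{n \to \infty} \frac{1}{n} \log \beta_n = -D_{\rm kl}(P \| Q),
\]
so relative entropy exactly governs the exponential rate at which the testing error decays.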

In this course, we will study information-theoretic quantities and their connections to estimation and statistics in some depth, showing applications to many of the areas above. Except to provide background, we will not cover standard information-theoretic topics such as source coding or channel coding, focusing instead on the probabilistic and statistical consequences of information theory.

Texts

Required:

  • Lecture notes I am preparing for the course. These will change throughout the quarter as I rewrite them to reflect the actual content of the course, so be sure to reload them frequently.

Recommended or useful: the following texts are not necessary, but will give additional perspective on the material in the class.

Grading

Your grade will be determined by approximately four problem sets (50%) and a final project (50%).