Statistics 311/Electrical Engineering 377: Information Theory and Statistics

John Duchi, Stanford University, Fall 2023

Lectures

Tuesday and Thursday, 10:30AM – 11:50AM, Building 380 (Math Building), room 380C.

Contact and communication with staff

  • Staff email list: stats311-aut2324-staff@lists.stanford.edu

  • We will use Ed for discussion this quarter.

  • We will use Gradescope for grading and problem set submissions.

Instructor

John Duchi

  • Office hours: Mondays, 3:00pm - 4:00pm, 126 Sequoia Hall.

Teaching Assistants

Rohith Kuditipudi

  • Office hours: Thursdays, 1:00pm - 3:00pm, 3rd floor of Gates Building (common area)

Chen Cheng

  • Office hours: Tuesdays, 2:30pm - 4:30pm, 207 (Bowker) Sequoia Hall.

Prerequisites

Mathematical maturity and any convex combination of Stats 300A, Stats 310A, CS229, and EE276a.

Description

Information theory was developed to solve fundamental problems in the theory of communication, but its connections to statistical estimation and inference date nearly to the birth of the field. With their focus on fundamental limits, information-theoretic techniques have provided deep insights into optimal procedures for a variety of inferential tasks. In addition, the basic quantities of information theory (entropy and relative entropy, together with their generalizations to other divergence measures such as f-divergences) are central in many areas of mathematical statistics and probability.
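
For orientation, here are the standard definitions of these quantities in the discrete case (the notation in lecture may differ):

    H(X) = -\sum_x p(x) \log p(x)                       (entropy)
    D_{kl}(P \| Q) = \sum_x p(x) \log( p(x) / q(x) )    (relative entropy)
    D_f(P \| Q) = \sum_x q(x) f( p(x) / q(x) )          (f-divergence, f convex with f(1) = 0)

Relative entropy is the f-divergence obtained by taking f(t) = t \log t.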

The applications of these tools are numerous. In mathematical statistics, for example, they allow characterization of optimal error probabilities in hypothesis testing, determination of minimax rates of convergence for estimation problems, and demonstration of equivalence between (ostensibly) different estimation problems, and they lead to penalized estimators and the minimum description length principle. In probability, they provide insight into the central limit theorem and large deviations theory (via Sanov's theorem and other results), and they appear in empirical process theory and concentration of measure. Information-theoretic techniques also arise in game playing, gambling, and stochastic optimization and approximation, among other areas.
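
To make the first two of these connections concrete, here are two standard results, stated informally and not necessarily in the course's notation. The Chernoff–Stein lemma says that, when testing i.i.d. samples from P against Q, the best type II error \beta_n achievable by tests with type I error at most \alpha \in (0, 1) satisfies

    \lim_{n \to \infty} (1/n) \log \beta_n = -D_{kl}(P \| Q),

and Sanov's theorem says that the empirical distribution \hat{P}_n of n i.i.d. draws from Q satisfies, for suitably regular sets A of distributions,

    (1/n) \log \Pr( \hat{P}_n \in A ) \to -\inf_{P \in A} D_{kl}(P \| Q).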

In this course, we will study information-theoretic quantities and their connections to estimation and statistics in some depth, showing applications to many of the areas above. Except to provide background, we will not cover standard information-theoretic topics such as source coding or channel coding, focusing instead on the probabilistic and statistical consequences of information theory.

Grading

Your grade will be determined by approximately four problem sets (50%) and a final project (50%), with some bonuses for class participation. We reserve the right to change the grading rubric at any time.