EE376A/Stats376A: Information Theory

David Tse (dntse@stanford.edu) Winter 2016-2017

Course Description

Information theory was invented by Claude E. Shannon in 1948 as a mathematical theory of communication, but it has since found a broad range of applications. The first two-thirds of the course covers the core concepts of information theory, including entropy and mutual information, and how they emerge as fundamental limits of data compression and communication. The last one-third of the course focuses on applications of information theory to statistics and machine learning.
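As a small taste of the first concept the course covers, here is a minimal sketch (not course material, just an illustration) of Shannon entropy, which measures the uncertainty of a discrete distribution in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # → 1.0
# A biased coin is more predictable, hence lower entropy.
print(entropy([0.9, 0.1]))
```

Entropy is exactly the quantity that sets the fundamental limit of lossless data compression mentioned above.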

Lectures

  • Tue, Thu 12:00-1:20 pm at Hewlett Teaching Center 201.

Announcement

  • Homework 1 is out. It is due on Thursday, January 19, at 5 pm.