EE376A/Stats376A: Information Theory
David Tse (firstname.lastname@example.org), Winter 2016-2017
Information theory was invented by Claude E. Shannon in 1948 as a mathematical theory of communication, but it has since found a broad range of applications. The first two-thirds of the course cover the core concepts of information theory, including entropy and mutual information, and how they emerge as fundamental limits of data compression and communication. The last one-third of the course focuses on applications of information theory to statistics and machine learning.
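As a small taste of the central quantity mentioned above, the sketch below computes the Shannon entropy of a discrete distribution, H(X) = -Σ p(x) log₂ p(x), measured in bits. The function name and example distributions are illustrative, not part of the course materials.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    `probs` is a sequence of probabilities summing to 1; zero-probability
    outcomes contribute nothing to the sum (0 * log 0 := 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))        # 1.0
# A biased coin carries less than 1 bit.
print(entropy([0.9, 0.1]))
# A deterministic outcome carries no uncertainty at all.
print(entropy([1.0]))             # 0.0
```

Entropy is the quantity that turns out to be the fundamental limit of lossless data compression: a source cannot, on average, be compressed below its entropy in bits per symbol.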