EE 376A: Information Theory

Stanford University, Tsachy Weissman, Winter Quarter 2014-15


  • Final exam and solutions are now posted. The final grades have been released.

  • Homework 7 has been released (due on March 10th).

  • 02-22-2015: Having finished our main topic of communication and channel capacity (for now; we will return to it toward the end of the quarter), our next main topic is lossy compression and the rate distortion function. Chapter 10 of the Cover and Thomas book is good reading for this material, with the usual caveat that our treatment will order the material differently, and some of our techniques and approaches to the proof of the main result will differ in ways we believe are constructive and ease the digestion of the material. (For reference, a standard definition of the rate distortion function is sketched just after these announcements.) The last lecture (on Thursday) and the coming one (this Tuesday) are dedicated to equipping ourselves with the statistical concepts of types and typicality (beyond those from the beginning of the quarter), which will allow us to hit the ground running when we get to the actual material on lossy compression. The method of types, which we covered in the last lecture, is treated in Section 11.1 of the book. The coming lecture will be dedicated to the notion of strong typicality. This is covered in Section 10.6 of the book, although we will adopt a slightly different treatment, following Sections 2.4 and 2.5 of Abbas El Gamal and Young-Han Kim, Network Information Theory, Cambridge University Press, 2012.

  • Material for the midterm: up to and including Lecture 8 (Feb. 3, 2015), and Homework 3.

  • Practice midterms are up! Enjoy.

  • Solutions to Homeworks 1-3 have been posted.

  • Homework 4 has been released (due on Feb 12th).

  • Students are required to scribe ONE lecture each. Each student should enter their name in front of the lecture they wish to scribe here. The lecture scribe template can be downloaded here.

  • TA Office Hours have been updated.

  • We will use Coursework for grade records. Please go to Coursework to look up your homework/midterm/final grades.

  • We have set up a forum on Piazza. You can sign up as a student here.
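
As a quick reference for the 02-22-2015 announcement above: the standard definition of the rate distortion function for a memoryless source X ~ p(x), reconstruction alphabet X̂, and distortion measure d(x, x̂) is (as in Chapter 10 of Cover and Thomas; the notation we use in lecture may differ slightly)

    R(D) = \min_{p(\hat{x} \mid x) \,:\, \mathbb{E}[d(X, \hat{X})] \le D} I(X; \hat{X}),

i.e., the minimum mutual information over all "test channels" p(x̂ | x) meeting the average distortion constraint D. The main result of the coming lectures is that R(D) characterizes the smallest achievable rate, in bits per source symbol, at distortion level D.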

Course Overview

Information theory is the science of operations on data such as compression, storage, and communication. It is among the few disciplines fortunate to have a precise date of birth: 1948, with the publication of Claude E. Shannon's paper entitled “A Mathematical Theory of Communication”.

Our course will explore the basic concepts of information theory. It is a prerequisite for research in this area, and highly recommended for students planning to delve into the fields of communications, data compression, and statistical signal processing. The intimate acquaintance that we will gain with measures of information and uncertainty, such as mutual information, entropy, and relative entropy, will also be invaluable for students, researchers, and practitioners in fields ranging from neuroscience to machine learning. Also encouraged to enroll are students of statistics and probability, who will gain an appreciation for the interplay between information theory, combinatorics, probability, and statistics.
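
For readers less familiar with the quantities mentioned above, here are the standard definitions for discrete random variables X and Y with joint distribution p(x, y) (logarithms are typically taken base 2 in this subject, giving units of bits; these quantities are developed carefully in the opening lectures):

    H(X) = -\sum_x p(x) \log p(x)                                   (entropy)
    D(p \| q) = \sum_x p(x) \log \frac{p(x)}{q(x)}                  (relative entropy)
    I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)}   (mutual information)

In particular, I(X; Y) = D(p(x, y) \| p(x) p(y)), so mutual information measures how far X and Y are from being independent.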