EE 276: Course Outline

Stanford University, Tsachy Weissman, Winter Quarter 2020-21

The course outline and slides/notes/references (if any) will be provided on this page. You can find the resources from previous course iterations below.

Winter 2020 EE 376A course material

The course outline and slides/notes/references (if any) are provided below (see the introductory lecture slides for the tentative course outline). The lecture videos are available on Canvas.

  • Lecture 1, Jan 7: Introductory lecture [slides]

  • Lecture 2, Jan 9: Information Measures

  • Lecture 3, Jan 14: Asymptotic Equipartition Property (AEP) and near-lossless compression

  • Lecture 4, Jan 16: Variable-length compression: Huffman code, Kraft-McMillan inequality

  • Lecture 5, Jan 21: Variable-length compression: recap, entropy as a lower bound, Shannon codes, block coding

  • Lecture 6, Jan 23: Shubham Chandak - [Lecture notes] Stationary processes and entropy rate (Ref: Cover & Thomas 4.1, 4.2), Universal compressors - LZ77 (Ref: C&T 13.4, 13.5), Application to genomic data compression [Slides] [Paper]. Additional resources on the convergence of LZ: [EE376C notes].

  • Lecture 7, Jan 28: Reliable communication I: channel capacity, examples

  • Lecture 8, Jan 30: Reliable communication II: channel capacity theorem, Fano's inequality

  • Lecture 9, Feb 4: Reliable communication III: channel coding converse

  • Lecture 10, Feb 6: Mert Pilanci - Polar Codes [slides] [annotated slides] [additional slides on decoding]

  • Lecture 11, Feb 11: Information measures for continuous RVs, AWGN channel

  • Lecture 12, Feb 13: Lossy compression I: rate-distortion function, examples

  • Lecture 13, Feb 18: Lossy compression II: intuition, converse

  • Lecture 14, Feb 20: Lossy compression III: joint typicality, achievability

  • Lecture 15, Feb 25: Joint source-channel coding and the separation theorem

  • Lecture 16, Feb 27: Kedar Tatwawadi - Information Theory meets Machine Learning [slides]

  • Lecture 17, Mar 3: Yanjun Han - Information-theoretic Lower Bounds [slides]

  • Lecture 18, Mar 5: Meltem Tolunay - Quantum Information Theory: Preliminaries, Super-dense coding, the CHSH game [notes] [additional reading]

  • Lecture 19, Mar 10: Irena Fischer-Hwang - Image Compression: From theory to practice [slides]. Additional resources: [GIF], [PNG 1, PNG 2], [JPEG], [Human compression]

  • Lecture 20, Mar 12: Dmitri Pavlichin - Genomic and tabular data compression + sundry IT adventures [slides]. Additional resources on genome compression: [IEEE Spectrum] [Bioinformatics]

Winter 2018 EE 376A course material

The lecture notes from Winter 2018 are provided below, and the lecture videos recorded by SCPD are available on Canvas. Timestamps connecting the topics to the lecture videos are available here. The textbook used was Elements of Information Theory (Cover & Thomas).

  • Jan 9: Introduction to Information Theory I

  • Jan 11: Introduction to Information Theory II

  • Jan 16: Information Measures

  • Jan 18: Asymptotic Equipartition Property (AEP)

  • Jan 23: Variable-length Lossless Compression

  • Jan 25: Kraft-McMillan Inequality and Huffman Coding

  • Jan 30: Optimality of Huffman Codes, Communication and Channel Capacity

  • Feb 1: Channel Capacity, Information measures for Continuous RVs

  • Feb 6: AWGN channel, Joint AEP

  • Feb 8: Channel Coding Theorem: Direct Part

  • Feb 13: Channel Coding Theorem: Converse Part

  • Feb 15: Lossy Compression and Rate Distortion Theory

  • Feb 20: Method of Types

  • Feb 22: Sanov's Theorem

  • Feb 27: Strong, Conditional and Joint Typicality

  • Mar 1: Strongly Typical Sequences and Rate Distortion

  • Mar 6: Strongly Typical Sequences and Rate Distortion 2

  • Mar 8: Joint Source-Channel Coding

  • Mar 13: Joint Source-Channel Coding 2, Slides

  • Mar 15: Information Theory in Machine Learning