EE 276: Course Outline

Stanford University, Tsachy Weissman, Winter Quarter 2023-24
  • Lecture 1, Jan 9: Introductory lecture.

  • Lecture 2, Jan 11: Information measures: entropy (joint, relative, and conditional).

    • [notes]

    • Similar in coverage to 2020 Video Lecture 2 (on Canvas).
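
As a quick illustration of these measures (not part of the course notes; the function name is my own), Shannon entropy in bits can be computed directly from a distribution:

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H(X) = -sum_x p(x) log2 p(x).
    # Terms with p(x) = 0 contribute nothing, by convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
```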

  • Lecture 3, Jan 16: Asymptotic Equipartition Property (AEP) and typicality.

    • [notes]

    • Similar in coverage to 2020 Video Lecture 3 (on Canvas).

  • Lecture 4, Jan 18: Variable length lossless compression: prefix and Shannon codes

  • Lecture 5, Jan 23: The Kraft-McMillan inequality and Huffman coding
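
The two topics of Lecture 5 connect directly: Huffman's algorithm produces codeword lengths that satisfy the Kraft inequality (with equality for dyadic sources). A minimal sketch, illustrative only and not course code:

```python
import heapq

def huffman_lengths(freqs):
    # Merge the two least-probable nodes repeatedly (Huffman's
    # algorithm) and track each symbol's resulting codeword length.
    # The integer tiebreaker keeps heap comparisons well-defined.
    heap = [(f, i, [sym]) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in freqs}
    counter = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1  # every symbol under a merge gets one bit deeper
        heapq.heappush(heap, (f1 + f2, counter, syms1 + syms2))
        counter += 1
    return lengths

# Dyadic source: lengths match -log2 p, and the Kraft sum is exactly 1.
lens = huffman_lengths({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125})
kraft = sum(2 ** -l for l in lens.values())  # sum_i 2^{-l_i} <= 1
```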

  • Lecture 6, Jan 25: Entropy rates and universal compression

    • Lecture video: 1/23 of Winter 2020 EE 276 on Canvas.

  • Lecture 7, Jan 30: Reliable communication and channel capacity
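
As a small worked example of channel capacity (again illustrative, not from the course materials): the binary symmetric channel with crossover probability p has capacity C = 1 - H(p) bits per channel use.

```python
import math

def binary_entropy(p):
    # Binary entropy H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel: C = 1 - H(p).
    return 1 - binary_entropy(p)

capacity_bsc = bsc_capacity(0.1)  # ~0.531 bits per channel use
```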

  • Lecture 8, Feb 1: Information measures for continuous random variables

  • Lecture 9, Feb 6: AWGN channel

  • Lecture 10, Feb 8: Channel coding theorem converse

  • Lecture 11, Feb 13: Joint AEP and channel coding theorem

  • Lecture 12, Feb 15: Polar codes

  • Lecture 13, Feb 20: Lossy Compression and Rate Distortion

  • Lecture 14, Feb 22: Lossy Compression and Rate Distortion, continued

  • Lecture 15, Feb 27: Method of Types

  • Lecture 16, Feb 29: Strong, conditional and joint typicality

  • Lecture 17, Mar 5: Joint source-channel coding and the separation theorem

  • Lecture 18, Mar 7: Joint source-channel coding and the separation theorem, continued