EE 276: Course Outline

Stanford University, Tsachy Weissman, Winter Quarter 2025-26

Assigned Viewing

Prior to each in-person lecture on Tuesdays, there will be assigned viewing of a few lecture videos from an earlier offering of the course. These lecture videos can be found in the “Winter 2025 - EE276 Archived Lectures” folder on Canvas under the Panopto Course Videos tab. The assigned viewing for each in-person lecture will be kept up to date below:

  • Week 2, 01-13-2026: EE276 on 1_9_2025, EE276 on 1_14_2025 (up to 23:41)

  • Week 3, 01-20-2026: EE276 on 1_14_2025 (from 23:41), EE276 on 1_16_2025, EE276 on 1_21_2025 (up to 32:00)

  • Week 4, 01-27-2026: EE276 on 1_21_2025 (from 32:00), EE276 on 1_23_2025

Estimated Outline

  • Lecture 1: Introductory lecture

  • Lecture 2: Information measures: entropy (joint, relative, and conditional)

    • Corresponds to Elements of Information Theory Chapter 2

  • Lecture 3: Asymptotic Equipartition Property (AEP) and typicality

    • Corresponds to Elements of Information Theory Chapter 3

  • Lecture 4: Variable length lossless compression: prefix codes

    • Corresponds to Elements of Information Theory Chapter 5

  • Lecture 5: The Shannon code, the Kraft-McMillan inequality, and the Huffman code

    • Corresponds to Elements of Information Theory Chapter 5

  • Lecture 6: Entropy rates and universal compression

    • Corresponds to Elements of Information Theory Chapter 4

  • Lecture 7: Reliable communication and channel capacity

    • Corresponds to Elements of Information Theory Chapter 6

  • Lecture 8: Information measures for continuous random variables

    • Corresponds to Elements of Information Theory Chapter 8

  • Lecture 9: AWGN channel

    • Corresponds to Elements of Information Theory Chapter 9

  • Lecture 10: Joint AEP and channel coding theorem (direct part)

    • Corresponds to Elements of Information Theory Chapter 6

  • Lecture 11: Proof of channel coding theorem (direct part)

    • Corresponds to Elements of Information Theory Chapter 6

  • Lecture 12: Proof of channel coding theorem (converse part)

    • Corresponds to Elements of Information Theory Chapter 6

  • Lecture 13: Polar codes

    • Corresponds to Elements of Information Theory Chapter 7

  • Lecture 14: Lossy compression and rate distortion

    • Corresponds to Elements of Information Theory Chapter 10

  • Lecture 15: Lossy compression and rate distortion continued

    • Corresponds to Elements of Information Theory Chapter 10

  • Lecture 16: Converse part of rate distortion theory

    • Corresponds to Elements of Information Theory Chapter 10

  • Lecture 17: Method of types

    • Corresponds to Elements of Information Theory Chapter 11

  • Lecture 18: Strong and joint typicality, direct part of rate distortion theory

    • Corresponds to Elements of Information Theory Chapter 10