The course outline and slides/notes/references (if any) will be provided on this page. You can find the resources from previous course iterations below.
Lecture 1, Tue Jan 12: Introductory lecture [slides]
Lecture 2, Thu Jan 14
Lecture 3, Tue Jan 19
Lecture 4, Thu Jan 21
Lecture 5, Tue Jan 26
Lecture 6, Thu Jan 28
Lecture 7, Tue Feb 2
Lecture 8, Thu Feb 4
Lecture 9, Tue Feb 9: Channel coding converse review + random coding achievability [slides]
Lecture 10, Thu Feb 11: Mert Pilanci: Polar Codes live lecture [slides] [annotated slides] [additional slides on decoding]
Lecture 11, Tue Feb 16
Lecture 12, Thu Feb 18: Rate distortion review + examples [slides]
Lecture 13, Tue Feb 23
Lecture 14, Thu Feb 25
Lecture 15, Tue Mar 2
Lecture 16, Thu Mar 4: Directed Information. References: HW1 Problem 7, estimation & application to finance, application to neuroscience
Lecture 17, Tue Mar 9: Yanjun Han – Divergences and Le Cam's two-point method [slides] [EE378C: Information-theoretic Lower Bounds in Data Science]
Lecture 18, Thu Mar 11: Jay Mardia – Statistical vs computational notions of hardness [slides]
Lecture 19, Tue Mar 16
Lecture 20, Thu Mar 18
Winter 2020 EE 376A course material
The course outline and slides/notes/references (if any) are provided below (see introductory lecture slides for tentative course outline). The lecture videos are available on Canvas.
Lecture 1, Jan 7: Introductory lecture [slides]
Lecture 2, Jan 9: Information Measures
Lecture 3, Jan 14: Asymptotic Equipartition Property (AEP) and nearlossless compression
Lecture 4, Jan 16: Variable length compression: Huffman code, Kraft-McMillan inequality
Lecture 5, Jan 21: Variable length compression: Recap, entropy as a lower bound, Shannon codes, block coding
Lecture 6, Jan 23: Shubham Chandak – [Lecture notes] Stationary processes and entropy rate (Ref: Cover & Thomas 4.1, 4.2), Universal compressors – LZ77 (Ref: C&T 13.4, 13.5), Application to genomic data compression [Slides] [Paper]. Additional resources on convergence of LZ [EE376C notes].
Lecture 7, Jan 28: Reliable communication I: channel capacity, examples
Lecture 8, Jan 30: Reliable communication II: channel capacity theorem, Fano's inequality
Lecture 9, Feb 4: Reliable communication III: channel coding converse
Lecture 10, Feb 6: Mert Pilanci: Polar Codes [slides] [annotated slides] [additional slides on decoding]
Lecture 11, Feb 11: Information measures for Continuous RVs, AWGN channel
Lecture 12, Feb 13: Lossy compression I: rate-distortion function, examples
Lecture 13, Feb 18: Lossy compression II: intuition, converse
Lecture 14, Feb 20: Lossy compression III: joint typicality, achievability
Lecture 15, Feb 25: Joint source-channel coding and the separation theorem
Lecture 16, Feb 27: Kedar Tatwawadi – Information Theory meets Machine Learning [slides]
Lecture 17, Mar 3: Yanjun Han – Information-theoretic Lower Bounds [slides]
Lecture 18, Mar 5: Meltem Tolunay – Quantum Information Theory: Preliminaries, Superdense coding, the CHSH game [notes] [additional reading]
Lecture 19, Mar 10: Irena Fischer-Hwang – Image Compression: From theory to practice [slides], Additional resources: [GIF], [PNG 1, PNG 2], [JPEG], [Human compression]
Lecture 20, Mar 12: Dmitri Pavlichin – Genomic and tabular data compression + sundry IT adventures [slides], Additional resources on genome compression: [IEEE Spectrum] [Bioinformatics]
Winter 2018 EE 376A course material
The lecture notes from Winter 2018 are provided below, and the lecture videos recorded by SCPD are available on Canvas. The timestamps connecting the topics to the lecture videos are available here. The textbook used was Elements of Information Theory.
Jan 9: Introduction to Information Theory I
Jan 11: Introduction to Information Theory II
Jan 16: Information Measures
Jan 18: Asymptotic Equipartition Property (AEP)
Jan 23: Variable-length Lossless Compression
Jan 25: Kraft-McMillan Inequality and Huffman Coding
Jan 30: Optimality of Huffman Codes, Communication and Channel Capacity
Feb 1: Channel Capacity, Information measures for Continuous RVs
Feb 6: AWGN channel, Joint AEP
Feb 8: Channel Coding Theorem: Direct Part
Feb 13: Channel Coding Theorem: Converse Part
Feb 15: Lossy Compression and Rate Distortion Theory
Feb 20: Method of Types
Feb 22: Sanov's Theorem
Feb 27: Strong, Conditional and Joint Typicality
Mar 1: Strongly Typical Sequences and Rate Distortion
Mar 6: Strongly Typical Sequences and Rate Distortion 2
Mar 8: Joint Source-Channel Coding
Mar 13: Joint Source-Channel Coding 2, Slides
Mar 15: Information Theory in Machine Learning
