Stat 375: Syllabus

Here is a rough syllabus (changes are possible, and suggestions/feedback are welcome).

April 2, 4

Equivalent graphical representations. Markov property and the Hammersley-Clifford theorem. Applications of graphical models to computer vision, statistical signal processing, and statistical inference.

April 9, 11

Polynomial reductions between various probabilistic inference tasks. Computational hardness. One-dimensional models: hidden Markov models, and the Viterbi and BCJR algorithms.
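As a quick illustration of inference in one-dimensional models, here is a minimal sketch of the Viterbi algorithm in pure Python. The toy weather HMM used in the comments (states, transition and emission probabilities) is a standard textbook example chosen for illustration, not material from the course.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence
    (dynamic programming over the chain; toy example, not optimized)."""
    # V[t][s]: probability of the best path ending in state s at time t.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # Maximize over the previous state.
            prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                             for p in states)
            V[t][s] = prob
            back[t][s] = prev
    # Backtrack from the best final state.
    best = max(states, key=lambda s: V[-1][s])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))
```

The same dynamic-programming structure, with max replaced by sum, gives the forward pass of the BCJR (forward-backward) algorithm.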

April 16, 18

Models on trees. Belief propagation.
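A small sketch of sum-product belief propagation on a tree, for intuition ahead of the lectures. The model here is an illustrative assumption: binary variables with unary potentials and symmetric pairwise potentials keyed by edge; on a tree, the resulting marginals are exact.

```python
def bp_marginals(nodes, edges, unary, pairwise):
    """Exact marginals on a tree-structured pairwise model via sum-product
    message passing. Assumes binary variables and symmetric pairwise
    potentials (pairwise is keyed by frozenset of the edge's endpoints)."""
    nbrs = {v: set() for v in nodes}
    for i, j in edges:
        nbrs[i].add(j)
        nbrs[j].add(i)
    msgs = {}  # msgs[(i, j)][x_j]: message from i to j

    def message(i, j):
        if (i, j) in msgs:
            return msgs[(i, j)]
        m = []
        for xj in (0, 1):
            total = 0.0
            for xi in (0, 1):
                # Local potentials times all incoming messages except from j.
                prod = unary[i][xi] * pairwise[frozenset((i, j))][(xi, xj)]
                for k in nbrs[i] - {j}:
                    prod *= message(k, i)[xi]
                total += prod
            m.append(total)
        msgs[(i, j)] = m
        return m

    marg = {}
    for v in nodes:
        b = [unary[v][x] for x in (0, 1)]
        for k in nbrs[v]:
            mk = message(k, v)
            b = [b[x] * mk[x] for x in (0, 1)]
        z = sum(b)
        marg[v] = [x / z for x in b]
    return marg
```

On a tree the recursion terminates at the leaves, and each node's normalized belief equals its true marginal.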

April 25, 30, May 2, 7

Variational inference: naive mean field, Bethe free energy, generalized BP and convex relaxations.
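As a preview of naive mean field, here is a minimal fixed-point iteration for an Ising model. The cycle-graph topology and the starting point are illustrative assumptions, not part of the course material.

```python
import math

def mean_field_ising(n, beta, h, iters=500):
    """Naive mean-field fixed point for an Ising model on an n-cycle:
    iterate m_i <- tanh(beta * (m_{i-1} + m_{i+1}) + h_i).
    (Toy sketch; topology and initialization are illustrative choices.)"""
    m = [0.5] * n  # arbitrary nonzero starting point
    for _ in range(iters):
        m = [math.tanh(beta * (m[(i - 1) % n] + m[(i + 1) % n]) + h[i])
             for i in range(n)]
    return m
```

The iteration descends (to a stationary point of) the naive mean-field free energy; for small coupling `beta` it contracts to a unique fixed point.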

May 9

Gaussian graphical models. Exact inference. Convergence of belief propagation.

May 14, 16

Learning graphical models: structural learning and parameter learning.

May 21, 23

Correlation decay. The Markov Chain Monte Carlo method. Relation with message passing algorithms.
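For concreteness, a minimal Gibbs sampler (a standard Markov Chain Monte Carlo method) for an Ising model; the cycle topology and parameters are illustrative assumptions.

```python
import math
import random

def gibbs_ising(n, beta, sweeps, seed=0):
    """Single-site Gibbs sampling for spins on an n-cycle with coupling beta.
    (Illustrative toy sketch; not from the course notes.)"""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            # Conditional distribution of spin i given its two neighbors.
            field = s[(i - 1) % n] + s[(i + 1) % n]
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
            s[i] = 1 if rng.random() < p_up else -1
    return s
```

Each update resamples one spin from its exact conditional, so the chain's stationary distribution is the Ising measure; mixing time and its relation to correlation decay are among the topics of these lectures.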

May 30, June 4, 6

Applications to clustering and classification.

More on correlation decay and computational hardness for learning and inference.

Homework will be assigned on April 2, 9, 16, 23, and 30, and May 7, 14, and 21. Each assignment is due one week after it is assigned.

A 24-hour take-home final will be assigned on June 6.