EE 378A: Syllabus

Here is a rough syllabus (the precise schedule will depend on progress in class; suggestions and feedback are welcome):

  1. Statistical Decision Theory

    1. basic settings and concepts (loss function, risk, admissibility, etc.)

    2. Bayes and minimax settings, minimax theorem
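As a preview of item 1.2, the relation between the Bayes and minimax settings can be summarized as follows (notation is ours, not taken from the course materials):

```latex
% R(\theta, \delta): risk of decision rule \delta at parameter \theta
% r(\pi, \delta) = \int R(\theta, \delta)\, d\pi(\theta): Bayes risk under prior \pi
\inf_{\delta} \sup_{\theta} R(\theta, \delta)
  \;\ge\; \sup_{\pi} \inf_{\delta} r(\pi, \delta)
```

The minimax theorem gives equality under suitable conditions, in which case a Bayes rule for a least-favorable prior is also a minimax rule.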

  2. Bayes Theory

    1. introduction, computation of posterior

    2. state estimation in hidden Markov processes, forward-backward recursion, the Viterbi algorithm

    3. approximate inference, particle filters
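To give a flavor of item 2.2, here is a minimal sketch of the Viterbi algorithm for a toy two-state HMM; all model parameters below are illustrative, not from the course.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely state sequence for an HMM, computed in the log domain.

    obs: observation indices; pi: initial state distribution;
    A[i, j]: transition prob i -> j; B[i, k]: emission prob of symbol k in state i.
    """
    n_states = len(pi)
    T = len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])   # log of delta_1(i)
    back = np.zeros((T, n_states), dtype=int)  # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)     # scores[i, j]: come from i, go to j
        back[t] = np.argmax(scores, axis=0)
        logd = scores[back[t], np.arange(n_states)] + np.log(B[:, obs[t]])
    # backtrack from the best final state
    path = [int(np.argmax(logd))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# toy two-state example (parameters are illustrative)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1], pi, A, B))  # -> [0, 0, 1, 1]
```

The forward-backward recursion covered in the same unit has the same dynamic-programming structure, with the max over previous states replaced by a sum.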

  3. Information-theoretic Functionals

    1. introduction to f-divergences, with properties and applications

    2. variational representation of f-divergences

    3. applications to statistical estimation: Cramér-Rao bound, minimax lower bound

    4. mutual information, its significance and properties
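As a preview of the definitions in this unit (a sketch, not course notes): for a convex f with f(1) = 0, the f-divergence and the Donsker-Varadhan variational representation of its most famous special case, KL divergence (f(x) = x log x), are

```latex
D_f(P \,\|\, Q) = \mathbb{E}_Q\!\left[ f\!\left( \tfrac{dP}{dQ} \right) \right],
\qquad
D_{\mathrm{KL}}(P \,\|\, Q) = \sup_{g} \; \mathbb{E}_P[g] - \log \mathbb{E}_Q\!\left[ e^{g} \right]
```

where the supremum runs over functions g for which the expectations exist. Variational representations of this kind underlie the estimation lower bounds in item 3.3.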

  4. Sequential Decision Making

    1. general setting

    2. prediction under logarithmic and general loss, the context-tree weighting (CTW) algorithm

    3. applications: filtering, denoising, directed information estimation
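The CTW algorithm in item 4.2 is built on the Krichevsky-Trofimov (KT) sequential probability assignment for binary sequences; as a small illustration (a sketch, not the full CTW algorithm):

```python
def kt_probability(seq):
    """Krichevsky-Trofimov sequential probability assignment for a binary
    sequence: P(next bit = 1) = (n1 + 1/2) / (n0 + n1 + 1).
    Returns the total probability the KT estimator assigns to `seq`.
    """
    n0 = n1 = 0
    prob = 1.0
    for b in seq:
        p1 = (n1 + 0.5) / (n0 + n1 + 1.0)
        prob *= p1 if b == 1 else (1.0 - p1)
        if b == 1:
            n1 += 1
        else:
            n0 += 1
    return prob

print(kt_probability([0, 1, 0, 1]))  # -> 3/128 = 0.0234375
```

Under logarithmic loss, the cumulative loss of a sequential predictor is exactly minus the log of the total probability it assigns to the observed sequence, which is why probability assignments and prediction are two views of the same problem.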

  5. Estimation of High-dimensional Functionals

    1. concentration of measure phenomenon

    2. optimal estimation of Shannon entropy, mutual information with applications

    3. bias analysis via the K-functional; bias reduction via the bootstrap, the jackknife, and Taylor expansion
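As a baseline for item 5.2, here is the plug-in entropy estimator together with the classical Miller-Madow first-order bias correction; the minimax-optimal estimators discussed in class are more involved, so this is only an illustrative sketch.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (empirical) Shannon entropy, in nats."""
    n = len(samples)
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / n
    return float(-np.sum(p * np.log(p)))

def miller_madow(samples):
    """Miller-Madow correction: add (S - 1) / (2n) to the plug-in estimate,
    where S is the number of distinct observed symbols."""
    n = len(samples)
    s = len(set(samples))
    return plugin_entropy(samples) + (s - 1) / (2.0 * n)

data = [0, 0, 1, 1, 2, 2, 3, 3]  # toy sample
print(plugin_entropy(data), miller_madow(data))
```

The plug-in estimator is biased downward in the high-dimensional regime (alphabet size comparable to sample size), which is what motivates the bias-reduction techniques listed in item 5.3.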

  6. Nonparametric Estimation

    1. discrete universal denoiser (DUDE)

    2. nonparametric function estimation: bias-variance tradeoff, wavelet shrinkage

    3. nonparametric functional estimation: bias reduction
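A minimal sketch of the shrinkage step in item 6.2: soft thresholding with the Donoho-Johnstone universal threshold. For brevity this applies the threshold directly to a sparse coefficient vector rather than to an actual wavelet transform; the signal and noise level below are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator: shrink each coefficient toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(0)
n, sigma = 128, 0.5
coeffs = np.zeros(n)
coeffs[:4] = [3.0, -2.5, 2.0, 1.5]            # sparse "wavelet" coefficients
noisy = coeffs + sigma * rng.normal(size=n)    # observe signal plus noise
t = sigma * np.sqrt(2 * np.log(n))             # universal threshold
denoised = soft_threshold(noisy, t)
```

Thresholding kills most pure-noise coefficients while keeping the few large signal coefficients (slightly shrunk), which is the bias-variance tradeoff of item 6.2 in action.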

  7. Statistical Learning Theory

    1. introduction and key differences from decision theory

    2. VC-type inequalities
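One representative VC-type inequality, for a class of {0,1}-valued functions (the constants vary across references; this is a sketch of the form such bounds take):

```latex
% S_{\mathcal{F}}(2n): growth (shatter) function of the class \mathcal{F}
\mathbb{P}\left( \sup_{f \in \mathcal{F}}
  \left| \frac{1}{n}\sum_{i=1}^n f(X_i) - \mathbb{E}[f(X)] \right|
  > \varepsilon \right)
  \;\le\; 4\, S_{\mathcal{F}}(2n)\, e^{-n\varepsilon^2 / 8}
```

When the class has finite VC dimension, the growth function is polynomial in n, so the bound goes to zero; this gives uniform convergence over the whole class, which is the key difference from the single-rule guarantees of decision theory in item 7.1.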

  8. Miscellaneous

    1. estimating the fundamental limits vs. achieving the fundamental limits

    2. further topics according to remaining time and student interests