Here is a rough syllabus (the precise schedule will depend on progress in class; suggestions and feedback are welcome):
Statistical Decision Theory
basic settings and concepts (loss function, risk, admissibility, etc.)
Bayes and minimax settings, minimax theorem
Bayes Theory
introduction, computation of the posterior
state estimation in hidden Markov processes, forward-backward recursion, Viterbi algorithm
approximate inference, particle filters
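As a preview of the Viterbi algorithm in this unit, here is a minimal NumPy sketch of the log-domain dynamic program for the most likely hidden state path; the function name and the toy two-state HMM in the usage example are illustrative, not from the course materials.

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely state path in an HMM via dynamic programming in the log domain.

    pi:  (S,) initial state probabilities
    A:   (S, S) transitions, A[i, j] = P(next = j | current = i)
    B:   (S, O) emissions,   B[i, k] = P(obs = k | state = i)
    obs: sequence of observation indices
    """
    S, T = len(pi), len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])        # delta_1(i)
    back = np.zeros((T, S), dtype=int)              # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)          # scores[i, j]: come from i, go to j
        back[t] = scores.argmax(axis=0)             # best predecessor of each state j
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]                     # trace back from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# toy example: two sticky states, each strongly tied to one observation symbol
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.1, 0.9]])
print(viterbi(pi, A, B, [0, 0, 1, 1]))  # [0, 0, 1, 1]
```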
Information-theoretic Functionals
introduction to f-divergences, with properties and applications
variational representation of f-divergences
applications to statistical estimation: Cramér-Rao bound, minimax lower bound
mutual information, its significance and properties
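To make the f-divergence definition D_f(P||Q) = Σ_x q(x) f(p(x)/q(x)) concrete, here is a small sketch showing how KL divergence and total variation arise from specific choices of f; the function names are illustrative, and q(x) > 0 is assumed throughout.

```python
import numpy as np

def f_divergence(p, q, f):
    """D_f(P||Q) = sum_x q(x) * f(p(x)/q(x)), assuming q(x) > 0 everywhere."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

def kl(p, q):
    # KL divergence: f(t) = t log t, with the convention 0 log 0 = 0
    f = lambda t: np.where(t > 0, t * np.log(np.maximum(t, 1e-300)), 0.0)
    return f_divergence(p, q, f)

def total_variation(p, q):
    # Total variation distance: f(t) = |t - 1| / 2
    return f_divergence(p, q, lambda t: 0.5 * np.abs(t - 1.0))

print(kl([1.0, 0.0], [0.5, 0.5]))              # log 2 ~ 0.693
print(total_variation([1.0, 0.0], [0.5, 0.5]))  # 0.5
```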
Sequential Decision Making
general setting
prediction under logarithmic loss and general loss, context-tree weighting (CTW) algorithm
applications: filtering, denoising, directed information estimation
Estimation of High-dimensional Functionals
concentration of measure phenomenon
optimal estimation of Shannon entropy and mutual information, with applications
bias analysis via the K-functional; bias reduction via the bootstrap, jackknife, and Taylor expansion
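As a preview of jackknife bias reduction in this unit, here is a minimal sketch applying the standard jackknife correction n·H_n − (n−1)·mean(H_{n−1}) to the plug-in entropy estimator, which is biased downward; the function names are illustrative.

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (MLE) estimate of Shannon entropy in nats; biased downward."""
    _, counts = np.unique(np.asarray(samples), return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def jackknife_entropy(samples):
    """Jackknife bias-corrected entropy: n*H_n - (n-1) * mean of leave-one-out estimates."""
    samples = np.asarray(samples)
    n = len(samples)
    loo = np.array([plugin_entropy(np.delete(samples, i)) for i in range(n)])
    return float(n * plugin_entropy(samples) - (n - 1) * loo.mean())

print(plugin_entropy([0, 0, 1, 1]))     # log 2 ~ 0.693
print(jackknife_entropy([0, 0, 1, 1]))  # larger: correction pushes against downward bias
```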
Nonparametric Estimation
discrete universal denoiser (DUDE)
nonparametric function estimation: bias-variance tradeoff, wavelet shrinkage
nonparametric functional estimation: bias reduction
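The core operation in wavelet shrinkage is soft thresholding of the wavelet coefficients, η_λ(x) = sign(x)(|x| − λ)_+: transform the noisy signal, shrink each coefficient toward zero by λ, and invert. A one-line sketch of the thresholding rule (the function name is illustrative):

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding rule eta_lam(x) = sign(x) * max(|x| - lam, 0),
    applied coefficient-wise in the wavelet domain during shrinkage."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

print(soft_threshold(np.array([3.0, -2.0, 0.5]), 1.0))  # [ 2. -1.  0.]
```

Small coefficients (mostly noise) are set exactly to zero, while large ones (signal) are shrunk by λ, which is the mechanism behind the bias-variance tradeoff discussed above.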
Statistical Learning Theory
introduction and key differences from decision theory
VC-type inequalities
Miscellaneous
estimating the fundamental limits vs. achieving the fundamental limits
further topics according to remaining time and student interests
