Introduction (1 lecture)
Two useful theorems from basic probability (1 lecture)
Law of large numbers and central limit theorem.
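A quick numerical sketch of both theorems (the Uniform(0,1) example is illustrative, not taken from the text): the sample mean of i.i.d. draws concentrates at the true mean, and the normalized sample mean is approximately standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Law of large numbers: the sample mean of n i.i.d. Uniform(0,1) draws
# converges to the true mean 1/2 as n grows.
n = 100_000
sample_mean = rng.uniform(0.0, 1.0, size=n).mean()

# Central limit theorem: sqrt(m) * (sample mean - mu) / sigma is
# approximately N(0,1), so over many independent trials its empirical
# standard deviation should be close to 1.
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)   # mean and std of Uniform(0,1)
trials, m = 2_000, 1_000
means = rng.uniform(0.0, 1.0, size=(trials, m)).mean(axis=1)
z = np.sqrt(m) * (means - mu) / sigma
z_std = z.std()
```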
Jointly Gaussian random vectors and their properties. (2 lectures)  Reading: Chapters 3.1-3.5 of the text
Hypothesis testing and detection (3 lectures)  Reading: Chapters 8.1-8.5 of the text
Maximum likelihood (ML), maximum a posteriori probability (MAP), and Bayes criteria.
Likelihood ratios, Neyman-Pearson test.
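As a concrete sketch (the hypotheses here are illustrative, not an example from the text): for H0: x ~ N(0,1) versus H1: x ~ N(1,1), the likelihood ratio is increasing in x, so the Neyman-Pearson test of size alpha reduces to comparing x against the (1 - alpha) quantile of the standard normal.

```python
import math

# Illustrative hypotheses: H0: x ~ N(0, 1) vs H1: x ~ N(1, 1).
def likelihood_ratio(x: float) -> float:
    # p1(x) / p0(x) = exp(x - 1/2), monotone increasing in x.
    return math.exp(x - 0.5)

def std_normal_cdf(x: float) -> float:
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def np_threshold(alpha: float) -> float:
    # Neyman-Pearson test of size alpha: declare H1 when x > t, where
    # P(x > t | H0) = alpha, i.e. t = Phi^{-1}(1 - alpha).
    # Invert the standard normal CDF by bisection (stdlib only).
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if std_normal_cdf(mid) < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

t = np_threshold(0.05)   # roughly 1.645 for alpha = 0.05
```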
Estimation (7 lectures)
Minimum mean-square error (MMSE) and linear least-squares estimation, orthogonality principle.
Recursive estimation, Kalman filtering.
Parameter estimation, Cramér-Rao bound.
Sparsity and compressed sensing.
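A minimal sketch of linear least-squares estimation and the orthogonality principle (the linear model Y = 2X + noise is my own illustrative choice): the best linear estimator leaves an error that is uncorrelated with the observation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative linear model (not from the text): Y = 2X + noise.
x = rng.normal(size=50_000)
y = 2.0 * x + rng.normal(size=50_000)

# Linear least-squares estimator Y_hat = a*X + b with
# a = Cov(X, Y) / Var(X) and b = E[Y] - a * E[X].
a = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
b = y.mean() - a * x.mean()

# Orthogonality principle: the estimation error Y - Y_hat is
# uncorrelated with the observation X.
err = y - (a * x + b)
corr = np.corrcoef(x, err)[0, 1]
```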
Introduction to stochastic processes (6 lectures)
Basic concepts: stationarity, ergodicity, independent increments, the Markov property.
Examples: Poisson process, Gaussian processes, Wiener process, white noise.
Second-order statistics: covariance function, power spectral density.
Transformation by linear time-invariant (LTI) systems.
Smoothing and causal Wiener filtering of stationary random processes.
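To make one of the examples above concrete, here is a minimal simulation sketch (rate and horizon are illustrative parameters): a rate-lambda Poisson process built from i.i.d. exponential inter-arrival times, whose count over [0, T] is Poisson(lambda*T), so its mean and variance both equal lambda*T.

```python
import numpy as np

rng = np.random.default_rng(2)

def poisson_count(lam: float, T: float, rng) -> int:
    # Simulate a Poisson process of rate lam on [0, T] via i.i.d.
    # Exponential(1/lam) inter-arrival times; return the count N(T).
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)
        if t > T:
            return n
        n += 1

# N(T) is Poisson(lam * T): empirical mean and variance of the counts
# should both be close to lam * T = 300.
lam, T = 3.0, 100.0
counts = np.array([poisson_count(lam, T, rng) for _ in range(2_000)])
mean_count = counts.mean()
var_count = counts.var()
```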
