EE278: Lecture Schedule

David Tse, Stanford University, Autumn 2020
  1. Introduction (1 lecture)

  2. Two useful theorems from basic probability (1 lecture)

    1. Law of large numbers and central limit theorem.
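
As a quick illustration of the central limit theorem (not part of the course materials), the sketch below standardizes the sample mean of i.i.d. Uniform(0,1) draws and compares its distribution with the standard normal CDF; the uniform distribution and the sample sizes are illustrative choices.

```python
# Minimal sketch: central limit theorem for i.i.d. Uniform(0, 1) samples.
# The standardized sample mean is compared with the standard normal CDF.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n, trials = 100, 50_000                        # samples per trial, number of trials
samples = rng.uniform(0.0, 1.0, (trials, n))   # i.i.d. Uniform(0, 1); mean 1/2, variance 1/12
z = (samples.mean(axis=1) - 0.5) / np.sqrt((1.0 / 12.0) / n)   # standardized sample mean

for t in (-1.0, 0.0, 1.0):
    print(f"P(Z <= {t:+.0f}): empirical {np.mean(z <= t):.4f}, normal {norm.cdf(t):.4f}")
```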

  3. Jointly Gaussian random vectors and their properties (2 lectures) - Reading: Sections 3.1-3.5 of the text
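
A minimal numerical sketch of jointly Gaussian vectors, assuming an illustrative mean and covariance: samples of X ~ N(mu, K) are generated from i.i.d. standard normals via the Cholesky factor of K, and the empirical covariance is checked against K.

```python
# Minimal sketch: sampling a jointly Gaussian vector X ~ N(mu, K)
# via the Cholesky factor of K, then checking the empirical covariance.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])                # mean vector (illustrative values)
K = np.array([[2.0, 0.6, 0.3],
              [0.6, 1.0, 0.2],
              [0.3, 0.2, 0.5]])                # covariance matrix (positive definite)

L = np.linalg.cholesky(K)                      # K = L L^T
Z = rng.standard_normal((3, 100_000))          # i.i.d. N(0, 1) components
X = mu[:, None] + L @ Z                        # each column is one sample of X

print(np.cov(X))                               # should be close to K
```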

  4. Hypothesis testing and detection (3 lectures) - Reading: Sections 8.1-8.5 of the text

    1. Maximum likelihood (ML), maximum a posteriori probability (MAP), and Bayes criteria.

    2. Likelihood ratios, Neyman-Pearson test.
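
A minimal sketch of a Neyman-Pearson test for the illustrative pair of hypotheses H0: Y ~ N(0,1) versus H1: Y ~ N(1,1); the likelihood ratio is monotone in y, so the test reduces to thresholding y at a level set by the false-alarm constraint.

```python
# Minimal sketch: Neyman-Pearson test for H0: Y ~ N(0,1) vs H1: Y ~ N(1,1).
# The likelihood ratio is monotone in y, so the NP test thresholds y directly;
# the threshold comes from the false-alarm constraint P(Y > t | H0) = alpha.
import numpy as np
from scipy.stats import norm

alpha = 0.05
t = norm.ppf(1 - alpha)            # threshold: standard normal (1 - alpha)-quantile
P_detect = 1 - norm.cdf(t - 1.0)   # P(Y > t | H1), since Y - 1 ~ N(0, 1) under H1

# Monte Carlo check of the false-alarm and detection probabilities.
rng = np.random.default_rng(1)
y0 = rng.normal(0.0, 1.0, 200_000)
y1 = rng.normal(1.0, 1.0, 200_000)
print("false alarm:", np.mean(y0 > t), "(target", alpha, ")")
print("detection  :", np.mean(y1 > t), "(theory", round(P_detect, 4), ")")
```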

  5. Estimation (7 lectures)

    1. Minimum mean-square error (MMSE) and linear least-squares estimation, orthogonality principle.

    2. Recursive estimation, Kalman filtering.

    3. Parameter estimation, Cramér-Rao bound.

    4. Sparsity and compressed sensing.
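
A minimal sketch of recursive estimation, assuming a scalar random-walk state observed in additive Gaussian noise; the model and the noise variances are illustrative choices, not taken from the course notes.

```python
# Minimal sketch: scalar Kalman filter for a random-walk state observed in noise,
#   x[k+1] = x[k] + w[k],  y[k] = x[k] + v[k],  w ~ N(0, Q), v ~ N(0, R).
import numpy as np

rng = np.random.default_rng(2)
Q, R, n = 0.01, 1.0, 200
x = np.cumsum(rng.normal(0, np.sqrt(Q), n))        # true state (random walk)
y = x + rng.normal(0, np.sqrt(R), n)               # noisy observations

x_hat, P = 0.0, 1.0                                # prior mean and variance
estimates = []
for yk in y:
    P = P + Q                                      # predict: variance grows by Q
    K = P / (P + R)                                # Kalman gain
    x_hat = x_hat + K * (yk - x_hat)               # update with the innovation
    P = (1 - K) * P                                # posterior variance
    estimates.append(x_hat)

print("RMSE (raw observations):", np.sqrt(np.mean((y - x) ** 2)))
print("RMSE (Kalman estimate) :", np.sqrt(np.mean((np.array(estimates) - x) ** 2)))
```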

  6. Introduction to stochastic processes (6 lectures)

    1. Basic concepts: stationarity, ergodicity, independent increments, the Markov property.

    2. Examples: Poisson process, Gaussian processes, Wiener process, white noise.

    3. Second-order statistics: covariance function, power spectral density.

    4. Transformation of random processes by linear time-invariant (LTI) systems.

    5. Smoothing and causal Wiener filtering of stationary random processes.
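
A minimal sketch of second-order statistics under LTI filtering, assuming a short illustrative FIR impulse response: for white-noise input with variance sigma^2, the output autocovariance is R_Y[k] = sigma^2 * sum_n h[n] h[n+k] (equivalently S_Y(f) = sigma^2 |H(f)|^2), which the code checks empirically.

```python
# Minimal sketch: white noise through an LTI filter.
# For zero-mean white-noise input with variance sigma2, the output autocovariance is
#   R_Y[k] = sigma2 * sum_n h[n] h[n+k]   (and S_Y(f) = sigma2 * |H(f)|^2).
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 1.0
x = rng.normal(0.0, np.sqrt(sigma2), 1_000_000)   # discrete-time white noise
h = np.array([0.5, 0.3, 0.2])                     # illustrative FIR impulse response
y = np.convolve(x, h, mode="same")                # LTI filtering

# Empirical autocovariance at lags 0..3 vs. the theoretical values.
for k in range(4):
    r_emp = np.mean(y[:-k or None] * y[k:])       # E[Y[n] Y[n+k]]; slice is the whole array when k = 0
    r_th = sigma2 * np.sum(h[:len(h)-k] * h[k:]) if k < len(h) else 0.0
    print(f"lag {k}: empirical {r_emp:+.4f}, theory {r_th:+.4f}")
```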