Books
Sections of the following books may be useful in parts of the course.
Charles J. Stone, “An Asymptotically Optimal Window Selection Rule for Kernel Density Estimates,” Ann. Statist., vol. 12, no. 4, pp. 1285–1297, 1984.
Larry Wasserman, All of Nonparametric Statistics, Springer, 2006.
László Györfi, Michael Kohler, Adam Krzyżak, and Harro Walk, A Distribution-Free Theory of Nonparametric Regression, Springer, 2002.
T. Kailath, A. H. Sayed, and B. Hassibi, Linear Estimation, Prentice Hall, NJ, 2000.
J. M. Bernardo and A. F. M. Smith, Bayesian Theory, Wiley, 2000.
E. L. Lehmann and G. Casella, Theory of Point Estimation, 2nd ed., Springer, 1998.
T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley, 1991.
D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, UK, 2003.
N. Cesa-Bianchi and G. Lugosi, Prediction, Learning, and Games, Cambridge University Press, New York, 2006.
The following books may be consulted for further reading on some of the topics we will touch on.
Classical statistical signal processing and time series analysis
H. Vincent Poor, An Introduction to Signal Detection and Estimation, Springer, 1994.
B. Porat, Digital Processing of Random Signals, Prentice-Hall, 1994.
A. Papoulis, Probability, Random Variables and Stochastic Processes, 3rd ed., McGraw-Hill, 1991.
P. J. Brockwell and R. A. Davis, Time Series: Theory and Methods, 2nd ed., Springer, 1991.
J. L. Doob, Stochastic Processes, John Wiley & Sons, 1953.
L. D. Davisson and R. M. Gray, Introduction to Statistical Signal Processing, Cambridge University Press, 2009.
W. A. Gardner, Introduction to Random Processes: with Applications to Signals and Systems, 2nd ed., McGraw-Hill, 1990.
H. Stark and J. W. Woods, Probability and Random Processes, with Applications to Signal Processing, 3rd ed., Prentice-Hall, 2001.
C. W. Therrien, Discrete Random Signals and Statistical Signal Processing, Prentice-Hall, 1992.
H. L. Van Trees, Detection, Estimation and Modulation Theory, Part I, Wiley, 1968.
Related to more modern signal processing
A. Doucet, N. de Freitas, and N. J. Gordon, eds., Sequential Monte Carlo Methods in Practice, Springer, New York, 2001.
F. V. Jensen and T. D. Nielsen, Bayesian Networks and Decision Graphs, 2nd ed., Springer, 2007.
S. L. Lauritzen, Graphical Models, Clarendon Press, Oxford, UK, 1996.
B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House, Boston, 2004.
S. Thrun, W. Burgard and D. Fox, Probabilistic Robotics, MIT Press, 2005.
R. G. Cowell, A. P. Dawid, S. L. Lauritzen, and D. J. Spiegelhalter, Probabilistic Networks and Expert Systems, Springer, New York, 1999.
D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009.
Xavier Guyon, Random Fields on a Network: Modeling, Statistics and Applications, Probability and its Applications, Springer-Verlag, New York, 1995.
Miscellaneous
Luc Devroye, László Györfi, and Gábor Lugosi, A Probabilistic Theory of Pattern Recognition, Springer-Verlag, New York, 1996.
Luc Devroye and Gábor Lugosi, Combinatorial Methods in Density Estimation, Springer, New York, 2001.
P. Brémaud, Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues, Springer, New York, 1999.
R. M. Gray, Toeplitz and Circulant Matrices: A Review, revised 2006.
M. Mezard and A. Montanari, Information, Physics, and Computation, Oxford University Press, Inc., New York, 2009.
Papers
P. Massart, “The Tight Constant in the Dvoretzky–Kiefer–Wolfowitz Inequality,” Ann. Probab., vol. 18, no. 3, pp. 1269–1283, 1990.
Hidden Markov models
Y. Ephraim and N. Merhav, “Hidden Markov Processes,” IEEE Trans. on IT, vol. 48, no. 6, Jun. 2002.
G. D. Forney, Jr., “The Viterbi Algorithm,” Proc. IEEE, vol. 61, no. 3, pp. 268–278, Mar. 1973.
L. R. Rabiner, “A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition,” Proc. IEEE, vol. 77, no. 2, pp. 257–286, Feb. 1989.
Graphical models
B. J. Frey and N. Jojic, “A Comparison of Algorithms for Inference and Learning in Probabilistic Graphical Models,” IEEE Trans. Pattern Anal. Machine Intell., vol. 27, no. 9, pp. 1392–1416, Sept. 2005.
M. I. Jordan, “Graphical Models,” Statist. Sci., vol. 19, no. 1, pp. 140–155, 2004.
H.-A. Loeliger, J. Dauwels, J. Hu, S. Korl, L. Ping, and F. R. Kschischang, “The Factor Graph Approach to Model-Based Signal Processing,” Proc. IEEE, vol. 95, no. 6, pp. 1295–1322, June 2007.
J. S. Yedidia, W. T. Freeman, and Y. Weiss, “Understanding Belief Propagation and its Generalizations,” Exploring Artificial Intelligence in the New Millennium, Science and Technology Books, Jan. 2003.
Particle and approximate nonlinear filtering
M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking,” IEEE Trans. on SP, vol. 50, no. 2, Feb. 2002.
D. Brigo, B. Hanzon, and F. Le Gland, “Approximate Nonlinear Filtering by Projection on Exponential Manifolds of Densities,” Bernoulli, vol. 5, no. 3, pp. 495–534, 1999.
Prediction, filtering, denoising of individual sequences
T. Weissman, E. Ordentlich, G. Seroussi, S. Verdu, and M. J. Weinberger, “Universal Discrete Denoising: Known Channel,” IEEE Trans. on IT, vol. 51, no. 1, Jan. 2005.
E. Ordentlich, G. Seroussi, S. Verdu, M. Weinberger, and T. Weissman, “Reflections on the DUDE,” IEEE Information Theory Society Newsletter, vol. 57, no. 2, pp. 5–10, June 2007 (invited).
T. Weissman, E. Ordentlich, M. Weinberger, A. Somekh-Baruch, and N. Merhav, “Universal Filtering via Prediction,” IEEE Trans. on IT, vol. 53, no. 4, pp. 1253–1264, April 2007.
Some oldies
A. Papoulis, “Predictable Processes and Wold's Decomposition: A Review,” IEEE Trans. on ASSP, vol. 33, no. 4, Aug. 1985.
R. E. Kalman, “A New Approach to Linear Filtering and Prediction Problems,” Trans. ASME, J. Basic Eng., vol. 82, Series D, pp. 35–45, 1960.
S. Geman and D. Geman, “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 6, pp. 721–741, 1984.
