Some relations between Information and Estimation
I will give a tour through a sparse and biased sample of the information theory literature - both classical and recent - on relations between information and estimation. Beyond their aesthetic value, these relations underlie some of the main tools in Shannon theory, such as the Entropy Power Inequality. They also give considerable insight into, and a quantitative understanding of, several estimation-theoretic objects, such as the costs of causality and of mismatch, as well as the performance and structure of minimax estimators. Further, they enable the transfer of analytic tools and algorithmic know-how from one domain to the other. Unabashedly, the focus will be on work in which I was involved. Even to that subset of the literature I will not do justice, but future presentations by other forum members will.
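As one representative example of the relations alluded to above (offered here only as an illustration, not as part of the tour itself), consider the I-MMSE relation of Guo, Shamai, and Verdú for the Gaussian channel: if $Y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$, then

$$
\frac{d}{d\,\mathrm{snr}}\, I\!\left(X; \sqrt{\mathrm{snr}}\, X + N\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\!\left[\left(X - \mathbb{E}[X \mid Y_{\mathrm{snr}}]\right)^2\right].
$$

That is, the derivative of the mutual information with respect to the signal-to-noise ratio equals half the minimum mean-square error of estimating the input from the output, tying an information quantity directly to an estimation quantity.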