Universal Estimation of Directed Information and its Applications

Jiantao Jiao
Graduate Student, Stanford University
Given on: Nov. 9th, 2012


Directed information is an information-theoretic quantity defined for a pair of jointly distributed sequences, and it is often a natural measure of the extent to which one sequence is relevant for causal inference on the other. It first appeared in the context of feedback communications, and was subsequently found useful for identifying and measuring causal relevance in neurological, biological, and financial data. The well-known Granger causality is a special case: it is the manifestation of directed information under certain (linear and Gaussian) model assumptions.
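The abstract does not spell out the definition; for concreteness, the standard definition (due to Massey) of the directed information from a sequence X^n to a sequence Y^n is

```latex
I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I\!\left(X^i;\, Y_i \mid Y^{i-1}\right),
```

which differs from the mutual information I(X^n; Y^n) = \sum_{i=1}^{n} I(X^n; Y_i \mid Y^{i-1}) only in that the conditioning on X is causal: the i-th term involves X^i = (X_1, ..., X_i) rather than the full X^n. This asymmetry is what makes the quantity directional, so that I(X^n \to Y^n) and I(Y^n \to X^n) generally differ.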

In this talk, we will briefly review the history of causal inference, explain why it is important in practice, and show how to build a unified framework that incorporates previously proposed causality measures. We will then introduce our recent work on directed information, which arises as a natural choice for a unified measure of causality, and show how to develop optimal estimators of directed information in practice. Finally, we will present experimental results, including the causal relationships between the Chinese and U.S. stock markets, and the inference of mechanical wave propagation patterns inside buildings from sensor recordings.
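The universal estimators discussed in the talk are not described in this abstract. As a purely illustrative sketch, the snippet below estimates the single-letter quantity I(X_i; Y_i | Y_{i-1}) by plugging empirical counts into the conditional mutual information formula; under a strong (and here purely illustrative) first-order Markov simplification, this serves as a crude proxy for the directed information rate. The function names and the demo data are hypothetical, not the talk's method.

```python
import math
import random
from collections import Counter

def cond_mutual_info(xs, ys, zs):
    """Plug-in estimate of I(X; Y | Z) in bits from paired samples.

    Computes the conditional mutual information of the empirical joint
    distribution of (X, Y, Z); this is always nonnegative.
    """
    n = len(xs)
    cxyz = Counter(zip(xs, ys, zs))
    cxz = Counter(zip(xs, zs))
    cyz = Counter(zip(ys, zs))
    cz = Counter(zs)
    mi = 0.0
    for (x, y, z), nxyz in cxyz.items():
        # p(x,y,z) * log2( p(x,y|z) / (p(x|z) p(y|z)) ), in count form.
        mi += (nxyz / n) * math.log2(nxyz * cz[z] / (cxz[(x, z)] * cyz[(y, z)]))
    return mi

def di_rate_markov1(x, y):
    """Crude directed-information-rate proxy I(X_i; Y_i | Y_{i-1}).

    Illustrative only: it assumes one step of memory in Y and ignores
    longer-range dependence that a universal estimator would capture.
    """
    return cond_mutual_info(x[1:], y[1:], y[:-1])

# Demo on hypothetical binary data: an instantaneous copy channel
# (Y_i = X_i) versus an independent sequence.
random.seed(0)
x = [random.randint(0, 1) for _ in range(20000)]
print(di_rate_markov1(x, x))                                  # near 1 bit
print(di_rate_markov1(x, [random.randint(0, 1) for _ in x]))  # near 0
```

The plug-in approach is consistent for i.i.d.-style single-letter quantities but is biased upward for small samples and large alphabets, which is part of why the universal estimators mentioned in the abstract are needed in practice.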