The Han–Jiao–Weissman (HJW) Kullback–Leibler (KL) divergence estimator

What is KL divergence?

The KL divergence is an information-theoretic measure introduced by Kullback and Leibler in 1951 that quantifies the discrepancy between two probability distributions. It plays a central role in information theory and in many other disciplines, including statistics, machine learning, physics, neuroscience, computer science, and linguistics.
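To make the quantity concrete, here is a minimal sketch of the textbook definition for two discrete distributions P and Q over the same alphabet, D(P || Q) = sum_i p_i * log(p_i / q_i). The function name and signature are illustrative, not part of our package:

```python
import math

def kl_divergence(p, q):
    """Textbook KL divergence D(P || Q) between two discrete
    distributions given as probability vectors over the same alphabet.
    Uses the natural logarithm (nats); divide by math.log(2) for bits."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi == 0.0:
                # D(P || Q) is infinite when P puts mass where Q does not.
                return math.inf
            total += pi * math.log(pi / qi)
    return total

print(kl_divergence([0.5, 0.5], [0.5, 0.5]))   # identical distributions: 0.0
print(kl_divergence([1.0, 0.0], [0.5, 0.5]))   # log 2 in nats
```

Note that D(P || Q) is asymmetric in its arguments and is finite only when Q assigns positive probability to every symbol that P does.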

What can our software do?

Our software consists of Matlab and Python (2.7 and 3) packages that estimate the KL divergence between two discrete distributions from two mutually independent sample sequences drawn from those distributions. For details about how it works, please see our 2016 paper “Minimax Rate-Optimal Estimation of Divergences between Discrete Distributions” by Yanjun Han, Jiantao Jiao and Tsachy Weissman. For details about how to use it in Matlab or Python, please check out our GitHub repo.
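To illustrate the input/output shape of such an estimator, here is a sketch of the naive plug-in baseline, which simply substitutes empirical frequencies for P and Q. This is not the HJW estimator from our paper (which achieves the minimax rate via more careful bias correction); the function name is illustrative only:

```python
import math
from collections import Counter

def plugin_kl_estimate(x_samples, y_samples):
    """Naive plug-in KL estimate from two independent sample sequences:
    replace P and Q by their empirical distributions. Shown only as a
    baseline; it is biased and suboptimal on large alphabets."""
    n, m = len(x_samples), len(y_samples)
    p_hat = Counter(x_samples)  # empirical counts for P
    q_hat = Counter(y_samples)  # empirical counts for Q
    total = 0.0
    for symbol, cnt in p_hat.items():
        if symbol not in q_hat:
            # empirical Q assigns zero mass where empirical P does not
            return math.inf
        total += (cnt / n) * math.log((cnt / n) / (q_hat[symbol] / m))
    return total

print(plugin_kl_estimate(['a', 'a', 'b', 'b'], ['a', 'a', 'b', 'b']))  # 0.0
```

In contrast to this baseline, the estimator in our package is designed to remain accurate even when the alphabet size is comparable to the number of samples.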