Jiantao Jiao

Ph.D. Candidate
Department of Electrical Engineering
Stanford University
Advisor: Prof. Tsachy Weissman
Co-Advisor: Prof. Andrea Montanari

Contact

Email: jiantao [at] stanford [dot] edu

Packard Building, Room No. 251
350 Serra Mall
Stanford, CA 94305

News

  • In Spring 2017 I am co-teaching EE378A again with my advisor! Building on the materials from the Spring 2016 offering, this year's course will be more topic-centered than framework-centered. We will demonstrate how to use information-theoretic methods in machine learning, discuss recent breakthroughs in information measure estimation, reveal the (surprising) phenomenon that estimating fundamental limits can be easier than achieving them, present the mathematics underlying the general phenomenon of effective sample size enlargement in information measure estimation (a small numerical illustration appears after this list), and introduce the powerful Peetre's K-functional and related approximation-theoretic quantities as aids to statistical analysis. We will also treat the denoising problem from both the learning-theoretic and decision-theoretic perspectives and investigate their relative strengths, and will cover general Bayes and minimax decision theory, learning theory, and powerful algorithms induced by the theory that have been widely deployed in practice.

  • Our paper "Minimax Rate-optimal Estimation of KL Divergence between Discrete Distributions" wins the Student Paper Award at ISITA 2016!

  • Our paper "Minimax Estimation of the L_1 Distance" is a finalist for the Jack Keil Wolf Student Paper Award at ISIT 2016!

  • In Spring 2016 I am co-teaching EE378A with my advisor. This year's EE378A offering has been substantially revised to reflect modern advances in the mathematics of information processing, as well as a detailed comparison and analysis of existing frameworks for data analysis.

  • Our paper "Maximum Likelihood Estimation of Information Measures" is featured in a semi-plenary session at ISIT 2015!

  • My talk at the Workshop on Information Theory, Learning and Big Data at the Simons Institute for the Theory of Computing

  • The Han–Jiao–Weissman (HJW) Kullback–Leibler (KL) divergence estimator has been released! Code website

  • Version 3.0 of the Jiao–Venkat–Han–Weissman (JVHW) entropy, Rényi entropy, and mutual information estimators has been released! Code website

  • The Matlab code for universal estimation of directed information has been released! Code website
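
To illustrate why estimators such as those announced above are needed, here is a minimal Python sketch (a hypothetical illustration, not our released code) of the standard plug-in (maximum likelihood) entropy estimator. Its severe negative bias when the sample size n is not much larger than the alphabet size S is exactly what the effective sample size enlargement phenomenon, mentioned in the Spring 2017 item above, addresses:

    import numpy as np

    def plugin_entropy(samples):
        # Plug-in (MLE) entropy estimate in nats: the entropy of the
        # empirical distribution of the observed samples.
        _, counts = np.unique(samples, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    rng = np.random.default_rng(0)
    S = 10_000              # alphabet size
    true_H = np.log(S)      # entropy of the uniform distribution over S symbols

    for n in (1_000, 10_000, 100_000):
        x = rng.integers(S, size=n)  # n i.i.d. uniform samples
        print(f"n = {n:7d}: plug-in = {plugin_entropy(x):.3f} nats, "
              f"truth = {true_H:.3f} nats")

The plug-in estimator is consistent only when n grows much faster than S, whereas minimax rate-optimal estimators such as JVHW succeed with n on the order of S / log S, as if the sample size had been enlarged by a logarithmic factor.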

Education

  • Ph.D. Candidate, Department of Electrical Engineering, Stanford University, Jan. 2013 - Present

  • M.Sc., Department of Electrical Engineering, Stanford University, Sept. 2012 - June 2014

  • B.E., Department of Electronic Engineering, Tsinghua University, Beijing, China, Aug. 2008 - July 2012

Research Interests

  • High Dimensional Statistics and Nonparametric Statistics

  • Statistical Machine Learning

  • Information Theory

  • Optimization

Professional Activities

  • Reviewer for IEEE Transactions on Information Theory, IEEE Transactions on Signal Processing, IEEE Transactions on Signal and Information Processing over Networks, Entropy, IEEE Statistical Signal Processing Workshop (SSP), Conference on Learning Theory (COLT), IEEE International Symposium on Information Theory (ISIT), IEEE Information Theory Workshop (ITW), and the ACM-SIAM Symposium on Discrete Algorithms (SODA)