Papers

This page is under construction. The manuscripts linked here are not necessarily the final published versions. A more up-to-date list can be found here. Please email me if you would like a PDF of any paper you cannot access online.

Recent Papers

Beyond Maximum Likelihood: from Theory to Practice
Jiantao Jiao, Kartik Venkat, Yanjun Han, Tsachy Weissman
Rateless Lossy Compression via the Extremes
Albert No, Tsachy Weissman
Minimax Estimation of Functionals of Discrete Distributions
Jiantao Jiao, Kartik Venkat, Yanjun Han, Tsachy Weissman
Non-asymptotic Theory for the Plug-in Rule in Functional Estimation
Jiantao Jiao, Kartik Venkat, Tsachy Weissman
Relations between Information and Estimation in Scalar Lévy Channels
Jiantao Jiao, Kartik Venkat, Tsachy Weissman
Information Measures: the Curious Case of the Binary Alphabet
Jiantao Jiao, Thomas Courtade, Albert No, Kartik Venkat, Tsachy Weissman
Justification of Logarithmic Loss via the Benefit of Side Information
Jiantao Jiao, Thomas Courtade, Kartik Venkat, and Tsachy Weissman
Compression Schemes for Similarity Queries
Idoia Ochoa, Amir Ingber and Tsachy Weissman
The Minimal Compression Rate for Similarity Identification
Amir Ingber and Tsachy Weissman
Unsupervised Learning and Universal Communication
Vinith Misra and Tsachy Weissman
Comparison of the Achievable Rates in OFDM and Single Carrier Modulation with I.I.D. Inputs
Yair Carmon, Shlomo Shamai, Tsachy Weissman
Compression for Quadratic Similarity Queries
Amir Ingber, Thomas Courtade, and Tsachy Weissman
Complexity and Rate-Distortion Tradeoff via Successive Refinement
Albert No, Amir Ingber and Tsachy Weissman
Capacity of a POST Channel with and without Feedback
Haim H. Permuter, Himanshu Asnani, and Tsachy Weissman
Network Compression: Worst-Case Analysis
Himanshu Asnani, Ilan Shomorony, A. Salman Avestimehr, and Tsachy Weissman
Minimax Filtering via Relations between Information and Estimation
Albert No and Tsachy Weissman
Information, Estimation, and Lookahead in the Gaussian Channel
Kartik Venkat, Tsachy Weissman, Yair Carmon, and Shlomo Shamai
Achievable Complexity-Performance Tradeoffs in Lossy Compression
Ankit Gupta, Sergio Verdu, and Tsachy Weissman
The Porosity of Additive Noise Sequences
Vinith Misra and Tsachy Weissman
Reference Based Genome Compression
B.G. Chern, I. Ochoa, A. Manolakos, A. No, K. Venkat, and T. Weissman
Successive Refinement with Decoder Cooperation and its Channel Coding Duals
Himanshu Asnani, Haim Permuter, and Tsachy Weissman
Estimation with a helper who knows the interference
Yeow-Khiang Chia, Rajiv Soundararajan, and Tsachy Weissman
Universal Estimation of Directed Information
Jiantao Jiao, Haim H. Permuter, Lei Zhao, Young-Han Kim, and Tsachy Weissman
DIcode is available
Achievable Error Exponents in the Gaussian Channel with Rate-Limited Feedback
Reza Mirghaderi, Andrea Goldsmith and Tsachy Weissman
Pointwise Relations between Information and Estimation
Kartik Venkat and Tsachy Weissman

Information and Estimation

The Relationship Between Causal and Non-Causal Mismatched Estimation in Continuous-Time AWGN Channels
Tsachy Weissman
Mutual Information, Relative Entropy, and Estimation in the Poisson Channel
Rami Atar and Tsachy Weissman
Pointwise Relations between Information and Estimation
Kartik Venkat and Tsachy Weissman

Action in Information

Compression with Actions
Lei Zhao, Yeow-Khiang Chia, and Tsachy Weissman
Capacity of Channels with Action-Dependent States
Tsachy Weissman
Source Coding with a Side Information 'Vending Machine'
Tsachy Weissman and Haim Permuter
Probing Capacity
Himanshu Asnani, Haim Permuter, and Tsachy Weissman
Multi-terminal Source Coding With Action Dependent Side Information
Yeow-Khiang Chia, Himanshu Asnani, and Tsachy Weissman

Multi-terminal Source Coding

Two-way Source Coding with a Helper
Haim Permuter, Yossef Steinberg, and Tsachy Weissman
Cascade, Triangular, and Two-Way Source Coding with Degraded Side Information at the Second User
Yeow Khiang Chia, Haim Permuter, and Tsachy Weissman
Cascade and Triangular Source Coding with Side Information at the First Two Nodes
Haim Permuter and Tsachy Weissman
Multiterminal Source Coding under Logarithmic Loss
Thomas Courtade and Tsachy Weissman

Feedback Communication

Coding for the feedback Gel'fand-Pinsker channel and the feedforward Wyner-Ziv source
Neri Merhav and Tsachy Weissman
Capacity of the Trapdoor Channel with Feedback
Haim Permuter, Paul Cuff, Benjamin Van Roy, and Tsachy Weissman
Coding Schemes for Additive White Noise Channels with Feedback Corrupted by Quantization or Bounded Noise
Nuno Martins and Tsachy Weissman
Finite-state channels with time-invariant deterministic feedback
Haim Permuter, Tsachy Weissman, and Andrea Goldsmith
Capacity Region of the Finite State Multiple Access Channel With or Without Feedback
Haim Permuter, Tsachy Weissman, and Jun Chen
Error Exponents for the Gaussian Channel with Active Noisy Feedback
Young-Han Kim, Amos Lapidoth, and Tsachy Weissman
To Feed or Not to Feed Back
Himanshu Asnani, Haim Permuter, and Tsachy Weissman

Denoising

Universal Discrete Denoising: Known Channel
Tsachy Weissman, Erik Ordentlich, Gadiel Seroussi, Sergio Verdú, and Marcelo Weinberger
Discrete Denoising for Channels with Memory
Rui Zhang and Tsachy Weissman
Reflections on the DUDE
Erik Ordentlich, Gadiel Seroussi, Sergio Verdú, Marcelo Weinberger, and Tsachy Weissman
Discrete Denoising with Shifts
Taesup Moon and Tsachy Weissman
Denoising via MCMC-based Lossy Compression
Shirin Jalali and Tsachy Weissman

Toward Universal and Practical Lossy Source Coding

A Universal Scheme for Wyner-Ziv Coding of Discrete Sources
Shirin Jalali, Sergio Verdú, and Tsachy Weissman
Lossy compression of discrete sources via the Viterbi algorithm
Shirin Jalali, Andrea Montanari, and Tsachy Weissman
Block and Sliding-Block Lossy Compression via MCMC
Shirin Jalali and Tsachy Weissman
Rate Distortion in Near-Linear Time
Ankit Gupta, Sergio Verdú, and Tsachy Weissman

Directed Information

Directed Information, Causal Estimation, and Communication in Continuous Time
Tsachy Weissman, Young-Han Kim, and Haim H. Permuter
Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing
Haim Permuter, Young-Han Kim, and Tsachy Weissman
Universal Estimation of Directed Information
Jiantao Jiao, Haim H. Permuter, Lei Zhao, Young-Han Kim, and Tsachy Weissman

Delay and Complexity Constrained Information Theory

On Causal Source Codes with Side Information
Tsachy Weissman and Neri Merhav
Source Coding with Limited Side Information Lookahead at the Decoder
Tsachy Weissman and Abbas El Gamal
On Real Time Coding with Limited Lookahead
Himanshu Asnani and Tsachy Weissman

Prediction, Learning, and Sequential Decision Making

On Real Time Coding with Limited Lookahead
Himanshu Asnani and Tsachy Weissman
Universal Filtering via Prediction
Tsachy Weissman, Erik Ordentlich, Marcelo Weinberger, Anelia Somekh-Baruch, and Neri Merhav
How to filter an 'individual sequence with feedback'
Tsachy Weissman
Scanning and sequential decision making for multi-dimensional data, Part II: the noisy case
Asaf Cohen, Tsachy Weissman, and Neri Merhav
Universal Reinforcement Learning
Vivek F. Farias, Ciamac C. Moallemi, Benjamin Van Roy, and Tsachy Weissman

Entropy

On the Optimality of Symbol by Symbol Filtering and Denoising
Erik Ordentlich and Tsachy Weissman
Entropy of Hidden Markov Processes and Connections to Dynamical Systems
Brian Marcus, Karl Petersen, and Tsachy Weissman (eds.)
Bounds on the Entropy Rate of Binary Hidden Markov Processes
Erik Ordentlich and Tsachy Weissman

Shannon Theory

The empirical distribution of rate-constrained codes
Tsachy Weissman and Erik Ordentlich
The Information Lost in Erasures
Sergio Verdú and Tsachy Weissman
Tighter Bounds on the Capacity of Finite-State Channels via Markov Set-Chains
Jun Chen, Haim Permuter, and Tsachy Weissman