Abstract

We propose to exploit the structure of the correlation between two random variables X and Y via a relaxation of the Common Information problem of Gács and Körner (GK Common Information). Consider two correlated sources X and Y generated from a joint distribution P_{X,Y}. We consider embeddings of X into discrete random variables U such that H(U|Y) < δ, while maximizing I(X;U). When δ = 0, this reduces to the GK Common Information problem. However, unlike the GK Common Information, which is known to be zero for many pairs of random variables (X, Y), we show that this relaxation captures the structure in the correlation between X and Y for a much broader range of joint distributions, and we showcase applications to some problems in multi-terminal information theory.
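The abstract does not commit to an algorithm, so the following is only a minimal brute-force sketch under one reading of the setup: U = f(X) is a deterministic function of X (one interpretation of "embeddings of X into discrete random variables U"), the constraint is taken as H(U|Y) ≤ δ for convenience, and the joint pmf, the function name relaxed_gk, and the δ values are all illustrative assumptions, not from the talk.

```python
import itertools

import numpy as np


def entropy(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)) + 0.0)  # + 0.0 normalizes -0.0


def relaxed_gk(p_xy, delta):
    """Brute-force the relaxed GK objective:
    maximize I(X;U) over deterministic maps U = f(X),
    subject to H(U|Y) <= delta.

    Only practical for tiny alphabets (there are |X|^|X| maps)."""
    nx, ny = p_xy.shape
    h_y = entropy(p_xy.sum(axis=0))
    best_value, best_map = -1.0, None
    for f in itertools.product(range(nx), repeat=nx):
        # Joint pmf of (U, Y): merge the rows of p_xy according to f.
        p_uy = np.zeros((nx, ny))
        for x, u in enumerate(f):
            p_uy[u] += p_xy[x]
        # Since U = f(X), I(X;U) = H(U); and H(U|Y) = H(U,Y) - H(Y).
        h_u = entropy(p_uy.sum(axis=1))
        h_u_given_y = entropy(p_uy.flatten()) - h_y
        if h_u_given_y <= delta and h_u > best_value:
            best_value, best_map = h_u, f
    return best_value, best_map


# Hypothetical example: doubly symmetric binary source, crossover 0.1.
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
print(relaxed_gk(p, delta=0.0))  # (0.0, (0, 0)): only constant maps survive
print(relaxed_gk(p, delta=0.5))  # (1.0, (0, 1)): identity map, H(U|Y) ≈ 0.47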


Presenters

Muriel Médard

Massachusetts Institute of Technology

Asaf Cohen

Ben-Gurion University of the Negev

Session Chair

Mokshay Madiman

University of Delaware