Abstract

The two most prevalent notions of common information are due to Wyner and to Gács and Körner, and both can be stated as distinct characteristic points in the lossless Gray-Wyner region. Although these quantities can be evaluated even for random variables with infinite entropy (e.g., continuous random variables), the operational significance underlying their definitions applies only to the lossless framework. The primary objective of this paper is to generalize these two notions of common information to the lossy Gray-Wyner network, which extends the theoretical intuition underlying their definitions to general sources and distortion measures. We begin with the lossy generalization of Wyner's common information, defined as the minimum rate on the shared branch of the Gray-Wyner network at minimum sum rate, when the two decoders reconstruct the sources subject to individual distortion constraints. We derive a complete single-letter information-theoretic characterization of this quantity and use it to compute the common information of symmetric bivariate Gaussian random variables. We then derive similar results to generalize Gács and Körner's definition to the lossy framework. These two characterizations allow us to carry the practical insight underlying the two notions of common information over to general sources and distortion measures.
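For the symmetric bivariate Gaussian case mentioned above, Wyner's common information has a widely cited closed form: for a jointly Gaussian pair with correlation coefficient ρ ∈ [0, 1), it equals ½ log((1+ρ)/(1−ρ)). The paper's single-letter characterization recovers this value; the sketch below simply evaluates the closed-form expression (in bits) and is an illustration, not the paper's derivation:

```python
import math

def wyner_common_information_gaussian(rho: float) -> float:
    """Closed-form Wyner common information (in bits) of a symmetric
    bivariate Gaussian pair with correlation coefficient rho in [0, 1)."""
    if not 0.0 <= rho < 1.0:
        raise ValueError("rho must lie in [0, 1)")
    # C(X; Y) = (1/2) * log2((1 + rho) / (1 - rho))
    return 0.5 * math.log2((1.0 + rho) / (1.0 - rho))
```

As expected, the quantity vanishes for independent sources (ρ = 0) and grows without bound as ρ → 1, i.e., as the two sources become fully correlated.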
