Abstract

Recently, Kumar, Li, and El Gamal proposed a notion of common information using a variation of the setup used to define the Wyner common information rate. This notion, known as the exact common information, is the minimum common randomness required for the exact and separate generation of a pair of correlated discrete memoryless sources. While the exact common information rate is not known to have a single-letter characterization, it was shown to equal the Wyner common information rate for the symmetric binary erasure source (Kumar, Li, and El Gamal, ISIT 2014). In earlier work (Vellambi and Kliewer, Allerton 2016), this result was extended to establish the equality of the two notions of common information for general noisy typewriter, $Z$-, and erasure sources. In this work, we investigate the connection between the exact and Wyner common information rates to derive two new implicit conditions (on the joint source distribution) that ensure the equality of the two notions.
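For context, the quantities compared in the abstract can be recalled as follows. This is a minimal sketch using the standard definitions from Wyner's work and from Kumar, Li, and El Gamal (ISIT 2014); the symbols $W$, $C_W$, $G$, and $\bar{G}$ are not introduced in the abstract itself. For a pair of correlated sources $(X,Y)$ with joint distribution $p_{XY}$,

$$C_W(X;Y) = \min_{p_{W\mid XY}\,:\, X - W - Y} I(X,Y;W), \qquad G(X;Y) = \min_{p_{W\mid XY}\,:\, X - W - Y} H(W),$$

where the minima are over auxiliary variables $W$ making $X$ and $Y$ conditionally independent, and the exact common information rate is the multi-letter limit $\bar{G}(X;Y) = \lim_{n\to\infty} \tfrac{1}{n}\, G(X^n;Y^n)$. In general $C_W(X;Y) \le \bar{G}(X;Y)$; the conditions derived in this work concern when this inequality holds with equality.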
