Abstract

We study a generalized version of Wyner's common information problem (also known as the distributed source simulation problem). The original problem asks for the minimum rate of the common input to independent processors needed to generate an approximation of a joint distribution, where the discrepancy between the synthesized and target distributions is quantified by the normalized relative entropy. Our generalization replaces this distance measure with the unnormalized and normalized Rényi divergences of order $\alpha = 1+s \in [0,2]$. We show that the minimum rate needed to ensure that the Rényi divergence between the distribution induced by a code and the target distribution vanishes remains the same as in Wyner's setting, except when the order $\alpha = 1+s = 0$. This implies that Wyner's common information is rather robust to the choice of distance measure. As a byproduct of the proofs of these results, we also establish the exponentially strong converse for the common information problem under the total variation distance.
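For context, the Rényi divergence of order $\alpha$ referenced above is, in its standard form (recalled here for the reader; the paper's exact normalization conventions may differ),
$$
D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
$$
which recovers the relative entropy $D(P\|Q)$ in the limit $\alpha \to 1$; the normalized variants divide by the blocklength $n$.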
