Abstract

Recently, two extensions of Wyner's common information, namely the exact and Rényi common informations, were introduced respectively by Kumar, Li, and El Gamal (KLE), and the present authors. Common information problems involve determining the minimum rate of the common input to two independent processors needed to exactly or approximately generate a target joint distribution. For the exact common information problem, exact generation of the target distribution is required, while for Wyner's and the $\alpha$-Rényi common informations, the relative entropy and the Rényi divergence of order $\alpha$, respectively, are used to quantify the discrepancy between the synthesized and target distributions. The exact common information is larger than or equal to Wyner's common information; however, it was hitherto unknown whether the former is strictly larger than the latter for some joint distributions. In this paper, we first establish the equivalence between the exact and $\infty$-Rényi common informations, and then provide single-letter upper and lower bounds for these two quantities. For doubly symmetric binary sources, we show that the upper and lower bounds coincide, which completely characterizes the exact and $\infty$-Rényi common informations for such sources. Interestingly, we observe that for such sources, these two common informations are strictly larger than Wyner's. This answers an open problem posed by KLE. Furthermore, we extend Wyner's, $\infty$-Rényi, and exact common informations to sources with countably infinite or continuous alphabets, including Gaussian sources.
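For context, the quantities above have standard definitions; the notation below is a sketch supplied by the editor and is not quoted from the paper. The Rényi divergence of order $\alpha \in (0, 1) \cup (1, \infty)$ between distributions $P$ and $Q$ is

$D_{\alpha}(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1 - \alpha}$,

which recovers the relative entropy $D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$ in the limit $\alpha \to 1$ and tends to the worst-case log-ratio $D_{\infty}(P \| Q) = \log \max_{x} \frac{P(x)}{Q(x)}$ as $\alpha \to \infty$. Wyner's common information admits the single-letter characterization

$C_{\mathrm{W}}(X; Y) = \min_{P_{W \mid XY} :\, X - W - Y} I(X, Y; W)$,

where the minimum is over auxiliary variables $W$ that render $X$ and $Y$ conditionally independent while preserving the target joint distribution. In the same notation, KLE's exact common information can be expressed as $K(X; Y) = \lim_{n \to \infty} \frac{1}{n} G(X^n; Y^n)$, where $G(X; Y) = \min_{X - W - Y} H(W)$ is the common entropy of a single source pair.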
