Abstract

The Brascamp-Lieb inequality in functional analysis can be viewed as a measure of the “uncorrelatedness” of a joint probability distribution. We define the smooth Brascamp-Lieb (BL) divergence as the infimum of the best constant in the Brascamp-Lieb inequality under perturbations of the joint probability distribution. An information-spectrum upper bound on the smooth BL divergence is proved using properties of the subgradient of a certain convex functional; in particular, in the i.i.d. setting this infimum converges to the best constant in a certain mutual information inequality. We then derive new single-shot converse bounds, in terms of the smooth BL divergence, for the omniscient-helper common randomness generation problem and the Gray-Wyner source coding problem; the proofs rely on the functional formulation of the Brascamp-Lieb inequality. Exact second-order rates are thus obtained in the stationary memoryless setting with nonvanishing error probability. These results provide rare instances of strong converses and second-order converses for continuous sources when the rate region involves auxiliary random variables.
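For orientation, the following is a minimal sketch of the quantities named above, using the entropic formulation of the BL constant common in the information-theoretic literature; the symbols $Q_X$, $Q_{Y_j|X}$, $c_j$, and $\epsilon$ are illustrative assumptions and may differ from the paper's exact notation.

% Illustrative sketch only; notation is assumed, not taken from the paper.
% Entropic form of the BL constant for an input law Q_X, random
% transformations Q_{Y_j|X}, and nonnegative exponents c_1, ..., c_m:
\[
  d\bigl(Q_X, (Q_{Y_j|X})_{j=1}^{m}, \mathbf{c}\bigr)
  \;:=\; \sup_{P_X \ll Q_X}
  \Bigl\{ \sum_{j=1}^{m} c_j\, D\bigl(P_{Y_j} \,\|\, Q_{Y_j}\bigr)
          \;-\; D\bigl(P_X \,\|\, Q_X\bigr) \Bigr\},
\]
% where P_{Y_j} is the output law of Q_{Y_j|X} under input P_X and
% D(. || .) denotes relative entropy. A smoothed version then takes an
% infimum over small perturbations of the joint law, e.g. within a
% total-variation ball of radius epsilon:
\[
  d_{\epsilon}
  \;:=\; \inf\bigl\{\, d\bigl(\tilde{Q}_X, (\tilde{Q}_{Y_j|X}), \mathbf{c}\bigr)
  \;:\; \|\tilde{Q} - Q\|_{\mathrm{TV}} \le \epsilon \,\bigr\}.
\]

Under this reading, the abstract's convergence claim says that in the i.i.d. setting the smoothed infimum recovers a best constant expressible through mutual information terms.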
