Abstract

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but it appears naturally in the context of distributed task encoding.
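As a rough sketch of the definitions (hedged: the symbols J_α and K_α are placeholder names, and the exact form is inferred from the abstract and the highlights below rather than quoted from the paper), both measures arise by minimizing a divergence between the joint distribution P_XY and a product distribution Q_X × Q_Y:

    % Sketch only: J_alpha and K_alpha are placeholder names; forms inferred, not quoted from the paper.
    \[
      D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha-1}\,
        \log \sum_{x,y} P(x,y)^{\alpha}\, Q(x,y)^{1-\alpha}
      \qquad \text{(R\'enyi divergence of order } \alpha>0,\ \alpha\neq 1\text{)}
    \]
    \[
      J_\alpha(X;Y) \;=\; \min_{Q_X,\,Q_Y} D_\alpha\!\bigl(P_{XY} \,\big\|\, Q_X \times Q_Y\bigr),
      \qquad
      K_\alpha(X;Y) \;=\; \min_{Q_X,\,Q_Y} \Delta_\alpha\!\bigl(P_{XY} \,\big\|\, Q_X \times Q_Y\bigr),
    \]

where Δ_α denotes the relative α-entropy. At α = 1 both divergences coincide with the relative entropy D, the minimum is attained at Q_X = P_X and Q_Y = P_Y, and both measures reduce to the mutual information I(X;Y) = D(P_XY ‖ P_X P_Y).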

Highlights

  • The solutions to many information-theoretic problems can be expressed using Shannon’s information measures such as entropy, relative entropy, and mutual information.

  • When α is one, the minimum is always achieved by Q_X = P_X and Q_Y = P_Y; this follows from Proposition 8 and the fact that D_1(P_XY ‖ Q_X Q_Y) = Δ_1(P_XY ‖ Q_X Q_Y) = D(P_XY ‖ Q_X Q_Y).

  • We prove the claim for α ∈ (1, ∞); for α ∈ {1, ∞} the claim then follows because J_α(X; Y) is continuous in α (Lemma 10).


Summary

Introduction

The solutions to many information-theoretic problems can be expressed using Shannon’s information measures such as entropy, relative entropy, and mutual information. (When α is one, the minimum is always achieved by Q_X = P_X and Q_Y = P_Y; this follows from Proposition 8 and the fact that D_1(P_XY ‖ Q_X Q_Y) = Δ_1(P_XY ‖ Q_X Q_Y) = D(P_XY ‖ Q_X Q_Y).) In both of the paper’s example plots, X is Bernoulli with Pr(X = 1) = 0.2, and Y is equal to X.
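As a quick numerical check of that example (a minimal sketch, not the paper’s code: it assumes, consistent with the sketch after the abstract, that J_α(X;Y) is the minimum of D_α(P_XY ‖ Q_X Q_Y) over product distributions Q_X × Q_Y), the following Python snippet grid-searches the minimization for X ~ Bernoulli(0.2) with Y = X and verifies that at α = 1 the minimizer sits at Q_X ≈ P_X, Q_Y ≈ P_Y with value equal to the mutual information I(X;Y) = H_b(0.2) ≈ 0.50 nats:

    # Sketch under the assumption J_alpha(X;Y) = min_{Q_X,Q_Y} D_alpha(P_XY || Q_X Q_Y);
    # not code from the paper.
    import numpy as np

    def renyi_div(P, Q, alpha):
        # Rényi divergence D_alpha(P || Q) in nats for finite distributions (alpha = 1 -> KL).
        P, Q = np.asarray(P, float), np.asarray(Q, float)
        m = P > 0
        if np.isclose(alpha, 1.0):
            return float(np.sum(P[m] * np.log(P[m] / Q[m])))
        return float(np.log(np.sum(P[m] ** alpha * Q[m] ** (1.0 - alpha))) / (alpha - 1.0))

    def J_alpha(P_xy, alpha, grid=201):
        # Brute-force minimum over product distributions Q_X x Q_Y (binary alphabets).
        qs = np.linspace(1e-6, 1 - 1e-6, grid)
        best, arg = np.inf, None
        for qx in qs:
            for qy in qs:
                Q = np.outer([1 - qx, qx], [1 - qy, qy])
                val = renyi_div(P_xy.ravel(), Q.ravel(), alpha)
                if val < best:
                    best, arg = val, (qx, qy)
        return best, arg

    # X ~ Bernoulli(0.2) and Y = X: P_XY has mass 0.8 at (0,0) and 0.2 at (1,1).
    P_xy = np.array([[0.8, 0.0], [0.0, 0.2]])
    P_x, P_y = P_xy.sum(axis=1), P_xy.sum(axis=0)   # marginals

    val, (qx, qy) = J_alpha(P_xy, alpha=1.0)
    mi = renyi_div(P_xy.ravel(), np.outer(P_x, P_y).ravel(), 1.0)   # I(X;Y) = D(P_XY || P_X P_Y)
    print(f"alpha=1: min {val:.4f} nats at Q_X(1)={qx:.3f}, Q_Y(1)={qy:.3f}; I(X;Y)={mi:.4f}")
    for a in (0.5, 2.0, 4.0):
        print(f"alpha={a}: J_alpha ~= {J_alpha(P_xy, a)[0]:.4f} nats")

The last loop simply prints the minimized divergence for a few other orders α; the α = 1 line is the check that matters for the claim quoted above.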

Related Work
Operational Meanings
Preliminaries
Two Measures of Dependence
Proofs