Abstract

This paper studies integral relations between the relative entropy and the chi-squared divergence, two fundamental divergence measures in information theory and statistics, together with the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. The applications studied in this paper refer to lossless compression, the method of types and large deviations, strong data–processing inequalities, bounds on contraction coefficients and maximal correlation, and the convergence rate to stationarity of a class of discrete-time Markov chains.

Highlights

  • The relative entropy and the chi-squared divergence [2] are divergence measures which play a key role in information theory, statistics, learning, signal processing, and other theoretical and applied branches of mathematics.

  • These divergence measures are fundamental in problems pertaining to source and channel coding, combinatorics and large deviations theory, goodness-of-fit and independence tests in statistics, expectation–maximization iterative algorithms for estimating a distribution from incomplete data, and other sorts of problems.

  • We study integral relations between the relative entropy and the chi-squared divergence, implications of these relations, and some of their information-theoretic applications.


Summary

Introduction

The relative entropy (also known as the Kullback–Leibler divergence [1]) and the chi-squared divergence [2] are divergence measures which play a key role in information theory, statistics, learning, signal processing, and other theoretical and applied branches of mathematics. These divergence measures are fundamental in problems pertaining to source and channel coding, combinatorics and large deviations theory, goodness-of-fit and independence tests in statistics, expectation–maximization iterative algorithms for estimating a distribution from incomplete data, and other sorts of problems (the reader is referred to the tutorial paper by Csiszár and Shields [3]). We next outline the contributions of this paper and the structure of the manuscript.
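
For concreteness, the following is a minimal numerical sketch (not part of the paper) of the two divergence measures for finite discrete distributions with Q of full support; the function names and the example distributions P and Q are illustrative assumptions, and the relative entropy is computed in nats.

```python
# Illustrative sketch: relative entropy and chi-squared divergence
# for finite discrete distributions P and Q (Q assumed to have full support).
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_x p(x) log(p(x)/q(x)), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi_squared_divergence(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum_x (p(x) - q(x))^2 / q(x)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum((p - q) ** 2 / q))

if __name__ == "__main__":
    P = [0.5, 0.3, 0.2]  # hypothetical example distributions
    Q = [0.4, 0.4, 0.2]
    print("D(P||Q)     =", relative_entropy(P, Q))
    print("chi^2(P||Q) =", chi_squared_divergence(P, Q))
```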

Paper Contributions
Paper Organization
Preliminaries and Notation
Relations between the Relative Entropy and the Chi-Squared Divergence
Implications of Theorem 1
Monotonic Sequences of f-Divergences and an Extension of Theorem 1
On Probabilities and f-Divergences
Application of Corollary 3
Strong Data–Processing Inequalities and Maximal Correlation
Proof of Theorem 1
Proof of Proposition 1
Proof of Theorem 2
Proof of Theorem 3
Proof of Theorem 4
Proof of Corollary 5
Proof of Theorem 5 and Corollary 6
Proof of Theorem 6
Proof of Corollary 7
Proof of Proposition 3
Proof of Proposition 4