Abstract

The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X and Y that is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, when standard Gaussian marginal distributions are used, the MI decomposes into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation, i.e., the correlation between ‘Gaussianized’ variables, and the non-Gaussian MI (Ing), which coincides with the joint negentropy and depends upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, within which Ing grows from zero at the ‘Gaussian manifold’, where the moments are those of Gaussian distributions, towards infinity at the set’s boundary, where a deterministic relationship between the variables holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output of a synthetic nonlinear channel contaminated by multiplicative and non-Gaussian additive noises, over a full range of signal-to-noise variance ratios (snr). We have studied the effect of varying the snr on Ig and Ing under several signal/noise scenarios.
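
As an illustrative sketch only (the channel, noise choices, and parameter values below are hypothetical and not taken from the paper), the Gaussian term Ig can be estimated from a sample by Gaussianizing each variable through its empirical distribution (a Gaussian anamorphosis) and applying the Gaussian MI formula Ig = -(1/2) ln(1 - ρg^2), where ρg is the correlation of the Gaussianized variables; the non-Gaussian term Ing would additionally require an estimate of the full MI, which is the subject of the companion paper.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)

def gaussianize(x):
    """Gaussian anamorphosis: map samples to standard Gaussian scores via empirical ranks."""
    u = rankdata(x) / (len(x) + 1.0)   # empirical CDF values in (0, 1)
    return norm.ppf(u)                 # standard Gaussian quantiles

def gaussian_mi(x, y):
    """Gaussian MI: Ig = -0.5 * ln(1 - rho_g^2), with rho_g the Gaussianized correlation."""
    rho_g = np.corrcoef(gaussianize(x), gaussianize(y))[0, 1]
    return -0.5 * np.log1p(-rho_g ** 2)

# Hypothetical nonlinear channel with multiplicative and non-Gaussian additive noise
n = 50_000
x = rng.standard_normal(n)                    # input signal
mult = 1.0 + 0.3 * rng.standard_normal(n)     # multiplicative noise
add = 0.5 * (rng.exponential(1.0, n) - 1.0)   # centred, skewed (non-Gaussian) additive noise
y = np.tanh(2.0 * x) * mult + add             # channel output

print(f"Ig ~ {gaussian_mi(x, y):.3f} nats")   # Gaussian part of the mutual information
```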

Highlights

  • One of the most commonly used information-theoretic measures is the mutual information (MI) [1], measuring the total amount of probabilistic dependence among random variables (RVs); see [2] for a unifying perspective and axiomatic review.

  • In order to build a sequence of MI lower bounds and use the procedure of Section 2.3, we have considered the sequence of information moment sets of even order (T2, θ2), (T4, θ4), (T6, θ6), ... with any pair of consecutive sets satisfying the premises of Theorem 1, i.e., all independent moment sets are Maximum Entropy (ME)-congruent (see the schematic sketch after these highlights).

  • We have addressed the problem of finding the minimum mutual information (MinMI), or the least noncommittal MI between d = 2 random variables, consistent with a set of marginal and joint expectations.
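
The schematic below is a sketch of that hierarchy, under the assumption (ours, for illustration) that each information moment set Tp collects the cross-moment constraints up to total order p; the exact definitions are those of Section 2 of the paper.

```latex
% Schematic sketch: nested even-order constraint sets yield a non-decreasing
% sequence of MinMI lower bounds approaching the true mutual information I(X,Y).
\[
  T_2 \subset T_4 \subset T_6 \subset \cdots
  \quad \Longrightarrow \quad
  I_2(X,Y) \le I_4(X,Y) \le I_6(X,Y) \le \cdots \le I(X,Y),
\]
% where I_p denotes the MinMI compatible with the ME marginals and with the joint
% expectations \theta_p prescribed on the moment set T_p.
```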

Summary

Introduction

One of the most commonly used information-theoretic measures is the mutual information (MI) [1], measuring the total amount of probabilistic dependence among random variables (RVs); see [2] for a unifying perspective and axiomatic review. The goal is the determination of theoretical lower MI bounds under certain conditions or, in other words, of the minimum mutual information (MinMI) [12] between two RVs X, Y, consistent both with imposed marginal distributions and with cross-expectations assessing their linear and nonlinear covariability. Those lower bounds can be obtained through the application of the Maximum Entropy (ME) principle. This paper is followed by a companion one [27] on the estimation of non-Gaussian MI from finite samples, with practical applications.
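
A minimal worked example of the MinMI idea (our simplifying choice, not the paper's general setting): assume standard Gaussian marginals and a single prescribed cross-expectation E[XY] = ρ. The maximum-entropy joint density consistent with these constraints is the bivariate Gaussian with correlation ρ, so the least noncommittal MI reduces to the Gaussian term Ig.

```latex
% Assumption: standard Gaussian marginals and the single constraint E[XY] = \rho.
% The ME joint density is then the bivariate Gaussian with unit variances and
% correlation \rho, whose mutual information gives the MinMI:
\[
  \mathrm{MinMI}(\rho) \;=\; I_g \;=\; -\tfrac{1}{2}\,\ln\!\bigl(1-\rho^{2}\bigr),
\]
% e.g. \rho = 0.8 gives I_g = -0.5\,\ln(0.36) \approx 0.51 nats. Prescribing further
% (nonlinear) cross-expectations can only raise this lower bound, the excess over
% I_g being the non-Gaussian term I_{ng}.
```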

MI Estimation from Maximum Entropy PDFs
General Properties of Bivariate Mutual Information
Congruency between Information Moment Sets
MI Estimation from Maximum Entropy Anamorphoses
Gaussian Anamorphosis and Gaussian Correlation
Gaussian and Non-Gaussian MI
The Sequence of Non-Gaussian MI Lower Bounds from Cross-Constraints
Non-Gaussian MI across the Polytope of Cross Moments
The Effect of Noise and Nonlinearity on Non-Gaussian MI
Discussion and Conclusions