Abstract

This paper aims to estimate the information shared between two random phenomena using well-established second-order statistics tools. The squared-loss mutual information, a surrogate of the Shannon mutual information, is chosen because it can be expressed as a second-order moment. We first review the rationale for i.i.d. discrete sources, which involves mapping the data onto the simplex space, and we highlight the links with other well-known concepts in the literature based on local approximations of information-theoretic measures. The problem is then translated to analog sources by mapping the data onto the characteristic space, with a focus on how the discrete rationale carries over to the analog case and on the limitations of that transfer. The proposed approach gains interpretability, scales to large data sets, and provides a unified rationale for choosing the free regularization parameters. Moreover, the structure of the proposed mapping allows resorting to Szegő's theorem to reduce the complexity of high-dimensional mappings, exhibiting a strong duality with spectral analysis. The performance of the developed estimators is analyzed using Gaussian mixtures.
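For reference, the squared-loss mutual information mentioned above (also known in the literature as least-squares mutual information) is half the chi-squared divergence between the joint density and the product of the marginals, which is what makes it a second-order moment of the density ratio; for discrete alphabets it reduces to a plain sum over the joint probability table:

```latex
% Squared-loss mutual information (SMI): half the chi-squared divergence
% between the joint density and the product of the marginals.
\[
\mathrm{SMI}(X,Y)
  = \frac{1}{2} \iint p_X(x)\,p_Y(y)
    \left( \frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)} - 1 \right)^{2}
    \mathrm{d}x \, \mathrm{d}y .
\]
% For discrete alphabets this becomes a second-order statistic of the
% joint probability table:
\[
\mathrm{SMI}(X,Y)
  = \frac{1}{2} \sum_{x,y}
    \frac{\bigl( p_{XY}(x,y) - p_X(x)\,p_Y(y) \bigr)^{2}}{p_X(x)\,p_Y(y)} .
\]
```

As a concrete illustration of the discrete rationale, the sketch below maps each sample onto a vertex of the simplex (a one-hot vector) and computes a plug-in SMI estimate purely from second-order moments of the mapped samples. This is a minimal sketch under our own simplifications (plug-in probabilities, no regularization); `smi_discrete` is a hypothetical name, not the paper's estimator.

```python
import numpy as np

def smi_discrete(x, y):
    """Plug-in squared-loss mutual information estimate for two i.i.d.
    discrete sequences, using only second-order statistics of one-hot
    (simplex) mappings. Illustrative sketch, not the paper's estimator."""
    xi = np.unique(x, return_inverse=True)[1]   # relabel symbols as 0..|X|-1
    yi = np.unique(y, return_inverse=True)[1]
    n = len(xi)
    X = np.eye(xi.max() + 1)[xi]                # n x |X| simplex mapping
    Y = np.eye(yi.max() + 1)[yi]                # n x |Y| simplex mapping
    px, py = X.mean(axis=0), Y.mean(axis=0)     # empirical marginals
    Pxy = X.T @ Y / n                           # empirical joint: a second-order moment
    C = (Pxy - np.outer(px, py)) / np.sqrt(np.outer(px, py))
    return 0.5 * np.sum(C**2)                   # half the chi-squared statistic

# Example: y is a noisy copy of x, so the estimate should be clearly positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=20000)
y = (x + rng.integers(0, 2, size=20000)) % 4
print(smi_discrete(x, y))
```

For analog sources, one plausible reading of the characteristic-space mapping is to evaluate complex exponentials of the data on a fixed frequency grid, so that empirical characteristic functions become ordinary sample means and dependence shows up in cross second-order moments of the features. The grid and mapping below are an assumed illustration, not the paper's construction.

```python
import numpy as np

def char_features(x, freqs):
    """Map real-valued samples onto the characteristic space:
    phi_k(x) = exp(j * w_k * x) on a fixed frequency grid `freqs`
    (assumed here for illustration). Column means approximate the
    characteristic function of x."""
    return np.exp(1j * np.outer(x, freqs))
```

With such features, the same recipe as in the discrete sketch applies: the joint feature moments minus the product of the feature means vanish under independence, yielding a second-order dependence statistic, which is the sense in which the discrete rationale carries over to the analog case.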
