Abstract

It is well known that joint interactions between agents can be described qualitatively as having synergistic, unique, and redundant components. In recent years, there have been renewed efforts to decompose mutual information, a general, non-parametric measure of joint interactions, into these constituent parts. We propose a novel, non-negative decomposition of the mutual information between two source variables and a target variable. The decomposition is defined for the exponential family and can therefore be applied to a broad range of distributions. We also show that the values in our decomposition arise naturally from testing hypotheses of conditional dependence. We demonstrate the method numerically on standard binary logic gates and Gaussian channels, and we apply it to investigate redundancy between brain regions using an fMRI-based image-classification dataset.
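The synergy the abstract alludes to is easiest to see in the standard XOR example: neither input alone carries any information about the output, yet the two inputs together determine it completely. The sketch below is not the paper's decomposition; it is a plain plug-in computation of mutual information over the four equally likely XOR input patterns, showing that each individual source contributes zero bits while the joint contribution is one bit, so the whole interaction is synergistic.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of equally likely (a, b) outcomes."""
    n = len(pairs)
    p_ab = Counter(pairs)                 # joint distribution
    p_a = Counter(a for a, _ in pairs)    # marginal of A
    p_b = Counter(b for _, b in pairs)    # marginal of B
    return sum((c / n) * math.log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# XOR gate: all four input patterns (x1, x2) are equally likely.
samples = [((x1, x2), x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

joint = mutual_information(samples)                                # I(X1,X2 ; Y) = 1 bit
single1 = mutual_information([(x1, y) for (x1, _), y in samples])  # I(X1 ; Y) = 0 bits
single2 = mutual_information([(x2, y) for (_, x2), y in samples])  # I(X2 ; Y) = 0 bits
```

Because the single-source terms vanish while the joint term is one bit, any sensible decomposition must assign all of the XOR interaction to the synergistic component, which is the qualitative behaviour the paper's logic-gate experiments verify.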
