Abstract

We describe two multivariate statistical dependence measures which can be orthogonally decomposed to separate the effects of pairwise, triplewise, and higher order interactions between the random variables. These decompositions provide a convenient method of analyzing statistical dependencies between large groups of random variables, within which smaller sub-groups may exhibit dependencies separately from the rest of the variables. The first dependence measure is a generalization of Pearson's φ², and we decompose it using an orthonormal series expansion of joint probability density functions. The second measure is based on the Kullback-Leibler distance, and we decompose it using information geometry. Applications of these techniques include analysis of neural population recordings and multimodal sensor fusion. We discuss in detail the simple example of three jointly defined binary random variables.
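To give a flavor of the kind of decomposition described, the following sketch (our own illustration, not code from the paper) computes, for three binary random variables, the total KL-based dependence (multi-information, the Kullback-Leibler distance from the joint distribution to the product of its marginals) and the third-order interaction information. The XOR-style joint distribution used here is a standard example: every pair of variables is independent, so all of the dependence is a purely triplewise interaction. Entropies are in bits; the paper's exact decomposition may use a different base and normalization.

```python
import itertools
import math

def marginal(p, axes):
    """Marginalize a joint pmf p (dict over 0/1 triples) onto the given axes."""
    out = {}
    for xyz, pr in p.items():
        key = tuple(xyz[a] for a in axes)
        out[key] = out.get(key, 0.0) + pr
    return out

def entropy(p):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(pr * math.log2(pr) for pr in p.values() if pr > 0)

# XOR example: Z = X xor Y with X, Y fair coins. Any two of the three
# variables are independent, so no pairwise term can explain the dependence.
p = {xyz: (0.25 if (xyz[0] ^ xyz[1]) == xyz[2] else 0.0)
     for xyz in itertools.product((0, 1), repeat=3)}

H_xyz = entropy(p)
H1 = {a: entropy(marginal(p, (a,))) for a in range(3)}
H2 = {ab: entropy(marginal(p, ab))
      for ab in itertools.combinations(range(3), 2)}

# Multi-information: KL(joint || product of marginals)
#   = H(X) + H(Y) + H(Z) - H(X, Y, Z)
multi_info = sum(H1.values()) - H_xyz

# Interaction information (the triplewise term):
#   I(X; Y; Z) = sum H(single) - sum H(pairs) + H(X, Y, Z)
interaction = sum(H1.values()) - sum(H2.values()) + H_xyz

# Pairwise mutual informations: I(Xa; Xb) = H(Xa) + H(Xb) - H(Xa, Xb)
pairwise = {ab: H1[ab[0]] + H1[ab[1]] - H2[ab]
            for ab in itertools.combinations(range(3), 2)}
```

For this distribution the multi-information is 1 bit while every pairwise mutual information is 0, so the entire dependence is carried by the third-order term; separating such contributions is what the decompositions in the paper formalize.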
