Abstract

Average mutual information (AMI) measures the dependence between pairs of random variables. It has been used in many applications, including blind source separation, data mining, neural synchronicity assessment, and state space reconstruction in human movement studies. Several algorithms and computational codes currently exist to estimate AMI; however, most are difficult to use, or the manner in which they calculate AMI is difficult to follow. We offer a straightforward, easily implemented Matlab (Mathworks, Inc.) function for computing AMI in relatively modest-sized data streams (N ≲ 15,000). Our algorithm incorporates best practices for statistical estimation that improve accuracy over other readily available options. We present three validation tests: (i) recovery of the known theoretical mutual information of a bivariate Gaussian random variable, (ii) invariance with respect to marginal distribution characteristics, and (iii) optimal time-delay selection in state space reconstruction.
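
For orientation, validation test (i) rests on the closed-form result that a bivariate Gaussian with correlation rho has mutual information -0.5*ln(1 - rho^2) nats. The sketch below reproduces that check in Python with a generic plug-in (2-D histogram) estimator; the bin count (16), correlation (0.8), sample size, and use of NumPy are illustrative assumptions, and this is not the authors' Matlab function, which incorporates additional statistical refinements.

    import numpy as np

    def ami_histogram(x, y, n_bins=16):
        # Plug-in mutual information estimate (in nats) from a joint 2-D histogram.
        joint, _, _ = np.histogram2d(x, y, bins=n_bins)
        pxy = joint / joint.sum()             # joint probabilities
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x (column vector)
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y (row vector)
        nz = pxy > 0                          # skip empty cells to avoid log(0)
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    # Validation test (i): bivariate Gaussian with correlation rho.
    rng = np.random.default_rng(0)
    rho, n = 0.8, 10_000                      # illustrative choices, not from the paper
    cov = [[1.0, rho], [rho, 1.0]]
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

    print("histogram estimate :", ami_histogram(x, y))
    print("theoretical value  :", -0.5 * np.log(1 - rho**2))

The plug-in estimate is biased and sensitive to the choice of bin count, which is one motivation for the more careful estimation practices described in the paper.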
