Abstract

We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open‐source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541–1573, 2017. © 2016 Wiley Periodicals, Inc.
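To make the core idea concrete, below is a minimal Python sketch of a Gaussian-copula MI estimate for two one-dimensional continuous variables: each variable is rank-transformed to uniform margins, mapped through the inverse normal CDF, and MI is then computed from the closed-form expression for Gaussian variables. This is an illustrative assumption-laden sketch, not the accompanying open-source Matlab/Python toolbox, and it omits the bias corrections a practical estimator would use.

```python
# Minimal sketch of the Gaussian-copula MI idea (illustrative only).
import numpy as np
from scipy.special import ndtri  # inverse standard normal CDF


def copula_normalize(x):
    """Rank-transform x to uniform margins, then map through the inverse
    normal CDF so the marginal distribution is standard normal."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x))        # ranks 0 .. n-1
    u = (ranks + 1) / (n + 1)                # empirical CDF values in (0, 1)
    return ndtri(u)


def gcmi_1d(x, y):
    """Gaussian-copula MI estimate (in bits) between 1-D samples x and y."""
    cx, cy = copula_normalize(x), copula_normalize(y)
    r = np.corrcoef(cx, cy)[0, 1]            # correlation of the copula-normalized data
    # Closed-form MI of a bivariate Gaussian: -0.5 * ln(1 - r^2) nats -> bits
    return -0.5 * np.log(1 - r ** 2) / np.log(2)


rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = x + 0.5 * rng.standard_normal(1000)      # dependent variable
print(gcmi_1d(x, y))                         # clearly greater than 0 bits
```

Because only the rank structure of the data enters the estimate, the result is robust to monotonic transformations of the marginals, which is one reason the copula construction is attractive in practice.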

Highlights

  • Mutual information (MI) measures the statistical dependence between two random variables [Cover and Thomas, 1991; Shannon, 1948]

  • Our primary intention is to present our new Gaussian Copula Mutual Information (GCMI) estimator as the effect size for a practical statistical test for neuroimaging, which can be considered a drop-in replacement for a number of different established statistical measures (Table I)

  • We demonstrate the bias and mean-square error of the GCMI estimator compared to other mutual information (MI) estimators on simulated systems as well as the example datasets (Section 4.4)


Introduction

Mutual information (MI) measures the statistical dependence between two random variables [Cover and Thomas, 1991; Shannon, 1948]. It can be viewed as a statistical test against a null hypothesis that two variables are statistically independent, but in addition its effect size (measured in bits) has a number of useful properties and interpretations [Kinney and Atwal, 2014]. There is a long history of applications of MI for the study of neural activity [Borst and Theunissen, 1999; Eckhorn and Pöpel, 1974; Fairhall et al., 2012; Nelken and Chechik, 2007; Rolls and Treves, 2011; Victor, 2006]. Recent studies have begun to explore its application to neuroimaging [Afshin-Pour et al., 2011; Caballero-Gaudes et al., 2013; Gross et al., 2013; Guggenmos et al., 2015; Ostwald and Bagshaw, 2011; Panzeri et al., 2008; Salvador et al., 2007; Saproo and Serences, 2010; Schyns et al., 2011; Serences et al., 2009].
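For reference, the two standard facts underlying the approach, written out explicitly (textbook definitions following Cover and Thomas; the Gaussian expression is the closed-form entropy exploited by the copula estimator):

```latex
% Mutual information as a difference of entropies
I(X;Y) = H(X) + H(Y) - H(X,Y)

% Differential entropy of a k-dimensional Gaussian with covariance \Sigma
H(X) = \tfrac{1}{2}\ln\!\left[(2\pi e)^{k}\,\lvert\Sigma\rvert\right]
```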

