Abstract

We develop two new multivariate statistical dependence measures. The first, based on the Kullback-Leibler distance, yields a single value that indicates the overall level of dependence among the random variables. The second, based on an orthonormal series expansion of joint probability density functions, provides more detail about the nature of the dependence. We apply these dependence measures to the analysis of simultaneous recordings made from multiple neurons, in which dependencies are time-varying and potentially information-bearing.
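For orientation, a Kullback-Leibler-based dependence measure of this kind plausibly takes the form of the divergence between the joint density and the product of its marginals (the total correlation). The abstract does not state the paper's exact definition, so the display below is an illustrative sketch under that assumption; the symbol \(\delta\) and the densities \(p\) and \(p_i\) are introduced here for illustration only.

% Sketch (assumption): KL-based multivariate dependence measure comparing
% the joint density p(x_1,...,x_n) to the product of its marginals p_i(x_i).
\[
  \delta(X_1,\ldots,X_n)
    \;=\; D_{\mathrm{KL}}\!\left( p(x_1,\ldots,x_n) \,\Big\|\, \prod_{i=1}^{n} p_i(x_i) \right)
    \;=\; \int p(x_1,\ldots,x_n)\,
      \log \frac{p(x_1,\ldots,x_n)}{\prod_{i=1}^{n} p_i(x_i)}\, dx_1 \cdots dx_n .
\]
% The quantity is nonnegative and equals zero exactly when the variables are mutually independent.

A measure of this form collapses all dependence structure into a single scalar, which is consistent with the abstract's description of the first measure; the second, series-expansion-based measure would instead resolve where in the joint density the dependence arises.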
