Abstract

Two partially interwoven hot topics in the analysis and statistical modeling of neural data are the development of efficient and informative representations of the time series derived from multiple neural recordings, and the extraction of information about the connectivity structure of the underlying neural network from the recorded neural activities. In the present paper we show that state-space clustering can provide an easy and effective option for reducing the dimensionality of multiple neural time series, that it can improve the inference of synaptic couplings from neural activities, and that it allows the construction of a compact representation of the multi-dimensional dynamics which easily lends itself to complexity measures. We apply a variant of the ‘mean-shift’ algorithm to perform state-space clustering, and validate it on a Hopfield network in the glassy phase, in which metastable states are largely uncorrelated with the memories embedded in the synaptic matrix. In this context, we show that the neural states identified as the clusters’ centroids offer a parsimonious parametrization of the synaptic matrix, which allows a significant improvement in inferring the synaptic couplings from the neural activities. Moving to the more realistic case of a multi-modular spiking network, with spike-frequency adaptation inducing history-dependent effects, we propose a procedure, inspired by Boltzmann learning but extending its domain of application, to learn inter-module synaptic couplings so that the spiking network reproduces a prescribed pattern of spatial correlations; we then illustrate how, in the spiking network, clustering is effective in extracting relevant features of the network’s state-space landscape.
Finally, we show that knowledge of the cluster structure allows casting the multi-dimensional neural dynamics in the form of a symbolic dynamics of transitions between clusters; as an illustration of the potential of such a reduction, we define and analyze a measure of complexity of the neural time series.
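The reduction to a symbolic dynamics can be made concrete with a small sketch (our own illustration, not the paper's code): once each time bin carries the label of its nearest cluster centroid, the label sequence can be summarized, for instance, by the entropy rate of the first-order Markov chain fitted to it. Note that this entropy rate is only an illustrative proxy for sequence complexity, not the complexity measure defined in the paper.

```python
import numpy as np

def markov_entropy_rate(labels, n_states):
    """Entropy rate (bits per step) of the first-order Markov chain
    fitted to a sequence of cluster labels."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1.0
    row = counts.sum(axis=1, keepdims=True)
    # transition probabilities (rows with no outgoing transitions stay zero)
    P = np.divide(counts, row, out=np.zeros_like(counts), where=row > 0)
    pi = row.ravel() / row.sum()  # empirical occupancy of each state
    h = 0.0
    for i in range(n_states):
        for j in range(n_states):
            if P[i, j] > 0:
                h -= pi[i] * P[i, j] * np.log2(P[i, j])
    return h
```

A perfectly periodic label sequence gives entropy rate 0, while an i.i.d. uniform binary sequence approaches 1 bit per step.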

Highlights

  • Technology nowadays allows neuroscientists to simultaneously record brain activity from increasingly many channels, at multiple scales; recent years witnessed a kind of ‘Moore’s Law’ for neural recordings [1], and this poses new challenges and opens new opportunities. One obvious challenge is to devise data representations that convey in a compact form the spatio-temporal structure of the recorded data

  • In the figure we report for comparison the R value obtained for the centroid sequence extracted from the time series generated by the reference Hopfield network; it is seen that, although the centroid sequence deviates from a Markov process, the relative difference w.r.t. the surrogate sequence is very small, much smaller than the one for the spiking dynamics, where spike-frequency adaptation (SFA) plays a major role

  • We started from a simple idea: to represent the multidimensional time series of neural activities as a density distribution in a corresponding multidimensional space, and perform a density-based clustering procedure to extract the local density maxima
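The idea in the last highlight can be illustrated with a minimal flat-kernel mean shift in NumPy (a sketch of the general technique, not the paper's exact variant of the algorithm): each point is iteratively moved toward the mean of the original points lying within a fixed bandwidth, so that points flowing to the same local density maximum end up in the same cluster.

```python
import numpy as np

def mean_shift(points, bandwidth, n_iter=50):
    """Flat-kernel mean shift: repeatedly move each point to the mean of
    the original points within `bandwidth`, then group points that
    converged to (nearly) the same location."""
    shifted = points.astype(float).copy()
    for _ in range(n_iter):
        for i in range(len(shifted)):
            d = np.linalg.norm(points - shifted[i], axis=1)
            shifted[i] = points[d <= bandwidth].mean(axis=0)
    # assign cluster labels: shifted points closer than bandwidth/2 merge
    centroids, labels = [], np.zeros(len(points), dtype=int)
    for i, p in enumerate(shifted):
        for k, c in enumerate(centroids):
            if np.linalg.norm(p - c) < bandwidth / 2:
                labels[i] = k
                break
        else:
            centroids.append(p)
            labels[i] = len(centroids) - 1
    return np.array(centroids), labels
```

On two well-separated blobs of points, the procedure recovers two centroids, one per local density maximum.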

Introduction

Technology nowadays allows neuroscientists to simultaneously record brain activity from increasingly many channels, at multiple scales; recent years witnessed a kind of ‘Moore’s Law’ for neural recordings [1], and this poses new challenges and opens new opportunities. One obvious challenge is to devise data representations that convey in a compact form the spatio-temporal structure of the recorded data. Correlations measured from single neuron pairs can obviously provide only ambiguous estimates of the direct synaptic couplings (due to confounding causes like common input to the sampled neurons). It was noticed in a landmark paper [3] that when many (order 100, for instance) simultaneous recordings are available, even though the underlying biological neural network is still dramatically undersampled, the global pattern of (individually small) pairwise correlations allows one to extract meaningful information about the synaptic connectivity. Many efforts were subsequently devoted both to extending the approach to non-equilibrium estimates, and to lightening the computational load of maximum entropy estimates (Boltzmann learning) through various mean-field approximations (see e.g. [4], [5], [6])
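The simplest of these mean-field shortcuts can be sketched as follows (a naive mean-field inverse-Ising estimate, given here as a generic illustration of the technique, not as the specific method of refs. [4]–[6]): the coupling matrix is approximated by minus the inverse of the connected correlation matrix of the recorded activities.

```python
import numpy as np

def nmf_couplings(spikes):
    """Naive mean-field inverse-Ising estimate of the couplings,
    J_ij = -(C^-1)_ij, where C is the connected correlation matrix.
    `spikes`: (T, N) array of binarized (+/-1) activities."""
    C = np.cov(spikes, rowvar=False)
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)  # self-couplings are not defined
    return J
```

By construction the estimate is symmetric with zero diagonal; it inherits the limitations of the naive mean-field approximation, which subsequent refinements (e.g. TAP corrections) aim to reduce.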
