Abstract

We introduce a new formalism for evaluating analytically the cross-correlation structure of a finite-size firing-rate network with recurrent connections. The analysis performs a first-order perturbative expansion of the neural activity equations, which include three different sources of randomness: the background noise of the membrane potentials, their initial conditions, and the distribution of the recurrent synaptic weights. This allows the analytical quantification of the relationship between anatomical and functional connectivity, i.e. of how the synaptic connections determine the statistical dependencies, at any order, among different neurons. The technique we develop is general, but for simplicity and clarity we demonstrate its efficacy by applying it to the case of synaptic connections described by regular graphs. The analytical equations so obtained reveal previously unknown behaviors of recurrent firing-rate networks, especially of how correlations are modified by the external input, by the finite size of the network, by the density of the anatomical connections, and by correlations in the sources of randomness. In particular, we show that a strong input can make the neurons almost independent, suggesting that functional connectivity does not depend only on the static anatomical connectivity, but also on the external inputs. Moreover, we prove that in general it is not possible to find a mean-field description à la Sznitman of the network if the anatomical connections are too sparse or if the three sources of variability are correlated. To conclude, we show a very counterintuitive phenomenon, which we call stochastic synchronization, through which neurons become almost perfectly correlated even if the sources of randomness are independent. Due to its ability to quantify how the activity of individual neurons and the correlations among them depend upon external inputs, the formalism introduced here can serve as a basis for exploring analytically the computational capabilities of population codes expressed by recurrent neural networks.

Electronic Supplementary Material: The online version of this article (doi:10.1186/s13408-015-0020-y) contains supplementary material 1.
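As a concrete illustration of the setting the abstract describes, the sketch below simulates a small stochastic firing-rate network with all three sources of randomness (background noise, random initial conditions, random synaptic weights) and estimates the pairwise cross-correlations empirically. The dynamics, the logistic activation, and every parameter value here are illustrative assumptions for a minimal example, not the paper's exact equations.

```python
# Minimal sketch (assumed model, not the paper's exact system): Euler-Maruyama
# simulation of a finite-size stochastic firing-rate network, followed by an
# empirical estimate of the pairwise cross-correlations of the potentials.
import numpy as np

rng = np.random.default_rng(0)

N = 50                 # number of neurons (finite size)
T, dt = 2.0, 1e-3      # time horizon and integration step
tau = 0.1              # membrane time constant
sigma = 0.2            # amplitude of the background Brownian noise
I_ext = 1.0            # strength of the external input

# Source of randomness 1: normally distributed recurrent synaptic weights.
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

def S(v):
    """Logistic activation function (one common choice; see the appendix)."""
    return 1.0 / (1.0 + np.exp(-v))

# Source of randomness 2: random initial membrane potentials.
V = rng.normal(0.0, 0.5, size=N)

steps = int(T / dt)
trace = np.empty((steps, N))
for t in range(steps):
    # Source of randomness 3: background Brownian noise of the potentials.
    noise = rng.normal(0.0, np.sqrt(dt), size=N)
    V = V + dt * (-V / tau + W @ S(V) + I_ext) + sigma * noise
    trace[t] = V

# Empirical functional connectivity: correlations between all neuron pairs,
# computed after discarding the initial transient.
corr = np.corrcoef(trace[steps // 2:].T)
print("mean off-diagonal correlation:", corr[~np.eye(N, dtype=bool)].mean())
```

Raising `I_ext` in this sketch drives the potentials into the saturated region of the activation function, where perturbations are suppressed; this is one intuitive reading of the abstract's observation that a strong input can make the neurons almost independent.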

Highlights

  • The brain is a complex system whose information processing capabilities critically rely on the interactions between neurons

  • The network we consider is stochastic and includes three distinct sources of randomness, namely the background noise of the membrane potentials, their initial conditions, and the distribution of the recurrent synaptic weights. With this approach we calculate analytically the correlations at any order among all groups of neurons in the network. The formalism is general and in principle can be applied to networks with any topology of the anatomical connections, but here we apply it to the case of regular graphs

  • In analogy to spectral graph theory, where the properties of a graph are studied in relation to its characteristic polynomial and eigenquantities, in this article we found the relation between the functional connectivity and the spectrum of the underlying structural connectivity (see the sketch after this list)
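
The spectral connection in the last highlight can be made concrete for the simplest regular topology, a ring (circulant) graph, whose adjacency spectrum is available in closed form as the discrete Fourier transform of its first row. The construction below is a hedged sketch; the ring topology, size, and degree are assumptions chosen for illustration.

```python
# Sketch: adjacency matrix and spectrum of a circulant regular graph.
# For circulant matrices the eigenvalues are the DFT of the first row,
# which is what makes the functional connectivity of such networks
# analytically tractable in terms of the structural spectrum.
import numpy as np

N = 12        # network size (illustrative)
degree = 4    # every neuron connects to `degree` neighbours (regular graph)

# First row of the circulant adjacency matrix: link each neuron to its
# `degree // 2` nearest neighbours on both sides of the ring.
first_row = np.zeros(N)
for k in range(1, degree // 2 + 1):
    first_row[k] = first_row[-k] = 1.0

# Each subsequent row is a cyclic shift of the first.
A = np.array([np.roll(first_row, i) for i in range(N)])

# Eigenvalues of a circulant matrix = DFT of its first row
# (real here because the graph is undirected).
eigs = np.fft.fft(first_row).real
print("adjacency spectrum:", np.sort(eigs)[::-1])

# Sanity check against direct diagonalisation of the full matrix.
assert np.allclose(np.sort(eigs), np.sort(np.linalg.eigvalsh(A)))
```

The same shortcut extends to block-circulant matrices with circulant blocks (listed in the outline below), whose spectra factor through multidimensional DFTs.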

Summary

Introduction

The brain is a complex system whose information processing capabilities critically rely on the interactions between neurons. In [30,31,32] the authors considered a discrete-time network of rate neurons whose sources of randomness were background Brownian motions for the membrane potentials and normally distributed synaptic weights. Building on these previous attempts to study network correlations including finite-size effects that go beyond the mean-field approximation, here we develop an approach based upon a first-order perturbative expansion of the neural equations. Propagation of chaos refers to the fact that if the initial conditions are $\nu_0$-chaotic and the neurons are exchangeable, then their joint law $\mu_t^{(N)}$ is $\nu_t$-chaotic for some probability measure $\nu_t$ on $\mathbb{R}^d$, for all times $t \in [0, T]$. With this formalism and this model, we quantify analytically how synaptic connections determine statistical dependencies at any order (not only at the pairwise level, as in previous studies) among different neurons.
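For reference, the standard notion of chaoticity in the sense of Sznitman that this statement relies on can be written out explicitly; the test-function formulation below is the textbook definition rather than a quotation from the paper. A sequence of exchangeable probability measures $\mu^{(N)}$ on $(\mathbb{R}^d)^N$ is $\nu$-chaotic if, for every fixed $k$ and all bounded continuous test functions $\varphi_1, \dots, \varphi_k$,

\[
\lim_{N \to \infty} \int_{(\mathbb{R}^d)^N} \varphi_1(x_1) \cdots \varphi_k(x_k) \, \mu^{(N)}(\mathrm{d}x) = \prod_{i=1}^{k} \int_{\mathbb{R}^d} \varphi_i(x) \, \nu(\mathrm{d}x).
\]

Applied at each time $t$, this says that any fixed group of neurons becomes asymptotically independent, with common marginal law $\nu_t$, as the network size grows.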

Description of the Model
Perturbative Expansion
Cross-Correlation and Probability Density
Other Measures of Functional Connectivity
Examples
Block-Circulant Matrices with Circulant Blocks
Symmetric Matrices
Product of Regular Graphs
Irregular Graphs
Numerical Comparison
Correlation as a Function of the Strength of the Network’s Input
Failure of Sznitman’s Mean-Field Theory
Chaos Does not Occur if the Sources of Randomness Are not Independent
Propagation of Chaos Does not Occur in Sufficiently Sparse Networks
Stochastic Synchronization
The General Theory
Examples
Discussion
Dependence of the Correlation Structure on the Parameters of the System
Strengths and Weaknesses of the Presented Approach
Analyzing the Consequences of Structural Damage
Possible Extensions to Other Measures of Communication Among Neurons
Concluding Statement
The Logistic Function
The Inverse Tangent Function