Abstract

Correlations in spike-train ensembles can seriously impair the encoding of information by their spatio-temporal structure. An inevitable source of correlation in finite neural networks is common presynaptic input to pairs of neurons. Recent studies demonstrate that spike correlations in recurrent neural networks are considerably smaller than expected based on the amount of shared presynaptic input. Here, we explain this observation by means of a linear network model and simulations of networks of leaky integrate-and-fire neurons. We show that inhibitory feedback efficiently suppresses pairwise correlations and, hence, population-rate fluctuations, thereby assigning inhibitory neurons the new role of active decorrelation. We quantify this decorrelation by comparing the responses of the intact recurrent network (feedback system) and systems where the statistics of the feedback channel is perturbed (feedforward system). Manipulations of the feedback statistics can lead to a significant increase in the power and coherence of the population response. In particular, neglecting correlations within the ensemble of feedback channels or between the external stimulus and the feedback amplifies population-rate fluctuations by orders of magnitude. The fluctuation suppression in homogeneous inhibitory networks is explained by a negative feedback loop in the one-dimensional dynamics of the compound activity. Similarly, a change of coordinates exposes an effective negative feedback loop in the compound dynamics of stable excitatory-inhibitory networks. The suppression of input correlations in finite networks is explained by the population-averaged correlations in the linear network model: In purely inhibitory networks, shared-input correlations are canceled by negative spike-train correlations. In excitatory-inhibitory networks, spike-train correlations are typically positive. Here, the suppression of input correlations is not a result of the mere existence of correlations between excitatory (E) and inhibitory (I) neurons, but a consequence of a particular structure of correlations among the three possible pairings (EE, EI, II).
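The shared-input correlation that the abstract refers to can be made concrete with a toy model: two neurons each pool k stationary Poisson inputs, s of which are common to both. The correlation coefficient of their summed input counts then equals the shared fraction s/k. The following is a minimal sketch under these assumptions; the function name and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def shared_input_correlation(k, s, lam=0.1, bins=50000):
    """Correlation of the summed input counts of two neurons, each
    receiving k Poisson inputs (rate lam counts/bin), s of them shared.
    The expected correlation coefficient is s/k."""
    # pooling s (or k-s) independent Poisson inputs gives a Poisson
    # count with s*lam (or (k-s)*lam) expected events per bin
    shared = rng.poisson(s * lam, bins)
    private1 = rng.poisson((k - s) * lam, bins)
    private2 = rng.poisson((k - s) * lam, bins)
    x = shared + private1
    y = shared + private2
    return np.corrcoef(x, y)[0, 1]
```

For k = 100 inputs with s = 25 shared, the measured correlation is close to 0.25; it is this structurally imposed correlation that the recurrent dynamics is shown to cancel.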

Highlights

  • Neurons generate signals by weighting and combining input spike trains from presynaptic neuron populations

  • We show that, in the presence of negative feedback, the effect of shared input caused by the structure of the network is compensated by its recurrent dynamics

  • By replacing the feedback with an inhomogeneous Poisson process whose time-dependent intensity is identical to the population rate in the recurrent network, we find that these oscillatory modes are neither suppressed nor amplified by the recurrent dynamics, i.e. the peaks in the resulting power spectra have the same amplitude in the feedback and in the feedforward case
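The feedforward surrogate in the last highlight requires sampling an inhomogeneous Poisson process with a prescribed rate trace. One standard way to do this is thinning: draw candidate events at the peak rate and accept each with probability proportional to the instantaneous rate. The sketch below assumes a discretized rate trace; the function name and parameters are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def inhomogeneous_poisson(rate, dt):
    """Spike times of an inhomogeneous Poisson process with intensity
    given by the rate trace (spikes/s, sampled at resolution dt),
    generated by thinning a homogeneous process at the peak rate."""
    lam_max = rate.max()
    t_end = len(rate) * dt
    # candidate events from a homogeneous process at rate lam_max
    n_cand = rng.poisson(lam_max * t_end)
    cand = np.sort(rng.uniform(0.0, t_end, n_cand))
    # accept each candidate with probability rate(t)/lam_max
    idx = np.minimum((cand / dt).astype(int), len(rate) - 1)
    keep = rng.uniform(0.0, lam_max, n_cand) < rate[idx]
    return cand[keep]
```

Feeding in the measured population rate of the recurrent network as the rate trace yields a surrogate feedback channel with matched first-order statistics but no feedback-induced correlations.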


Introduction

Neurons generate signals by weighting and combining input spike trains from presynaptic neuron populations. The number of possible signals which can be read out this way from a given spike-train ensemble is maximal if these spike trains span an orthogonal basis, i.e. if they are uncorrelated [1]. If they are correlated, the amount of information which can be encoded in the spatio-temporal structure of these spike trains is limited. Correlations impair the ability of readout neurons to decode information reliably in the presence of noise. This is often discussed in the context of rate coding: for N uncorrelated spike trains, the signal-to-noise ratio of the compound spike-count signal can be enhanced by increasing the population size N. The robustness of neuronal responses against noise therefore critically depends on the level of correlated activity within the presynaptic neuron population.
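The dependence of the compound signal-to-noise ratio on pairwise correlation can be seen in a simple surrogate model: for N units with count variance σ² and pairwise correlation ρ, the variance of the population sum is Nσ²(1 + (N − 1)ρ), so the SNR grows as √N for ρ = 0 but saturates at μ/(σ√ρ) for ρ > 0. The following is a minimal sketch using Gaussian surrogate counts rather than actual spike trains; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def compound_snr(n, rho, mu=5.0, sigma=2.0, trials=20000):
    """SNR (mean/std) of the summed counts of n units with mean mu,
    std sigma, and pairwise correlation rho (Gaussian surrogates)."""
    # shared-plus-private construction yields pairwise correlation rho
    shared = rng.normal(size=(trials, 1))
    private = rng.normal(size=(trials, n))
    counts = mu + sigma * (np.sqrt(rho) * shared
                           + np.sqrt(1.0 - rho) * private)
    pop = counts.sum(axis=1)
    return pop.mean() / pop.std()
```

With ρ = 0 the SNR keeps improving as the population grows (≈25 at N = 100, ≈50 at N = 400 for these parameters), whereas already ρ = 0.1 caps it near μ/(σ√ρ) ≈ 7.9 regardless of N, illustrating why correlations limit the benefit of pooling.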
