Abstract

A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies, as well as in studies of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells are in most cases far from Poisson. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study self-consistent statistics of the input and output spectra of neural spike trains. Instead of actually simulating a large network, we use an iterative scheme in which we simulate a single neuron over several generations. In each generation, the neuron is stimulated with surrogate stochastic input that has statistics similar to the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike-interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedures converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random, sparsely connected network of leaky integrate-and-fire neurons (Brunel, 2000) and show that in the asynchronous regime, close to a state of balanced synaptic input from the network, our iterative schemes provide an excellent approximation to the autocorrelation of spike trains in the recurrent network.
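The Gaussian variant of the iterative scheme described above can be sketched in a few lines: in each generation, a single leaky integrate-and-fire (LIF) neuron is driven by Gaussian surrogate noise whose power spectrum is proportional to the output spike-train spectrum of the previous generation. The following is a minimal, illustrative sketch only; all parameter values (`mu`, `tau`, `D`, thresholds, the number of generations) are assumptions for demonstration and are not taken from the paper, and the spectrum estimate is a single crude periodogram rather than the averaged estimators one would use in practice.

```python
import numpy as np

def gaussian_noise_from_spectrum(spectrum, rng):
    """Draw a real Gaussian time series whose power spectrum is
    (approximately) proportional to `spectrum` (one-sided, length n//2+1):
    fix the Fourier amplitudes, randomize the phases, transform back."""
    n = 2 * (len(spectrum) - 1)
    amp = np.sqrt(np.maximum(spectrum, 0.0))
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    coeffs = amp * np.exp(1j * phases)
    coeffs[0] = coeffs[0].real      # DC and Nyquist bins must be real
    coeffs[-1] = coeffs[-1].real
    x = np.fft.irfft(coeffs, n=n)
    std = x.std()
    return x / std if std > 0 else x  # normalize to unit variance

def simulate_lif(noise, dt, mu=1.5, tau=1.0, v_th=1.0, v_reset=0.0, D=0.1):
    """Euler-Maruyama integration of an LIF neuron driven by a constant
    mean input plus the surrogate noise; returns a binned spike train."""
    v = v_reset
    spikes = np.zeros(len(noise))
    for i, xi in enumerate(noise):
        v += dt * (mu - v) / tau + np.sqrt(2.0 * D * dt) * xi
        if v >= v_th:               # threshold crossing: spike and reset
            v = v_reset
            spikes[i] = 1.0 / dt    # delta-like spike of unit weight
    return spikes

def spike_spectrum(spikes, dt):
    """Crude one-sided power-spectrum estimate of the spike train."""
    s = spikes - spikes.mean()
    f = np.fft.rfft(s * dt)
    return np.abs(f) ** 2 / (len(spikes) * dt)

# Iterate over "generations" until the spectrum reproduces itself.
rng = np.random.default_rng(0)
dt, n = 0.01, 2 ** 14
spectrum = np.ones(n // 2 + 1)      # generation 0: flat (Poisson-like) spectrum
for generation in range(5):
    noise = gaussian_noise_from_spectrum(spectrum, rng)
    spikes = simulate_lif(noise, dt)
    spectrum = spike_spectrum(spikes, dt)
```

In a faithful implementation one would average the spectrum over many realizations per generation and monitor a convergence criterion (e.g., the distance between successive spectra) instead of a fixed generation count.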

Highlights

  • Neurons in different parts of the nervous system respond to repeated presentation of the same stimulus with considerable trial-to-trial variability (van Steveninck et al., 1997).

  • One way to deal with temporal correlations in the input is to extend the phase space of the Fokker-Planck equation by additional variables that can account for colored noise in the input. This has been done by Câteau and Reyes (2006) for the case of green noise that arises from a presynaptic refractory period, and it can be generalized and utilized to relate output spike-train statistics to temporal input statistics for a simple perfect integrate-and-fire neuron model (Schwalger et al., submitted).

  • For finite interspike-interval (ISI) correlations, we can expect a discrepancy between the power spectrum of the surrogate input and the power spectrum of the output spike train, even if our scheme has converged to a stationary output spike train.
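The renewal variant of the surrogate input mentioned in the abstract can be illustrated by resampling the ISIs observed in the previous generation and superposing several independent renewal trains. This is a hypothetical sketch, not the paper's implementation: resampling ISIs independently reproduces the ISI density but, by construction, discards any serial ISI correlations, which is exactly why the highlight above anticipates a residual discrepancy in the spectra.

```python
import numpy as np

def renewal_train(isis_observed, t_max, rng):
    """One renewal spike train on [0, t_max): successive intervals are
    drawn independently from the observed ISI sample (bootstrap resampling),
    so the ISI density is preserved but serial correlations are lost."""
    times = []
    # rough random start phase so superposed trains are not locked together
    t = rng.choice(isis_observed) * rng.uniform()
    while t < t_max:
        times.append(t)
        t += rng.choice(isis_observed)
    return np.array(times)

def superposed_input(isis_observed, k, t_max, rng):
    """Surrogate input: superposition of k independent renewal trains,
    each with the same ISI density as the previous generation's output."""
    all_times = np.concatenate(
        [renewal_train(isis_observed, t_max, rng) for _ in range(k)]
    )
    return np.sort(all_times)

# usage: ISIs of the previous generation (here a stand-in exponential sample)
rng = np.random.default_rng(1)
isis = rng.exponential(1.0, size=500)
train = superposed_input(isis, k=10, t_max=100.0, rng=rng)
```

The pooled spike times `train` would then drive the neuron of the next generation (e.g., as excitatory and inhibitory input trains with appropriate synaptic weights).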


INTRODUCTION

Neurons in different parts of the nervous system respond to repeated presentation of the same stimulus with considerable trial-to-trial variability (van Steveninck et al., 1997). One way to deal with temporal correlations in the input is to extend the phase space of the Fokker-Planck equation by additional variables that can account for colored noise in the input. This has been done by Câteau and Reyes (2006) for the case of green noise (high-pass filtered noise) that arises from a presynaptic refractory period, and it can be generalized and utilized to relate output spike-train statistics to temporal input statistics for a simple perfect integrate-and-fire neuron model (Schwalger et al., submitted). Another approach assumes a high degree of intrinsic or external uncorrelated noise that allows for a continuous rate-equation-like description of the activity in the neural network (see, e.g., Doiron et al., 2004; Lindner et al., 2005b; Pernice et al., 2011; Trousdale et al., 2012 for networks of integrate-and-fire neurons, and the recent review by Grytskyy et al., 2013 for other network types). We conclude by discussing the implications of our results for a more faithful description of neural noise emerging in recurrent networks.

MODELS AND METHODS
RECURRENT-NETWORK MODEL
Gaussian approximation for the input of the next generation
Renewal approximation for the input of the next generation
Convergence and uniqueness of the algorithms
SELF-CONSISTENT SPECTRUM USING TWO DIFFERENT ITERATIVE SCHEMES
COMPARISON OF SPECTRA IN RECURRENT NETWORKS AND THE SELF-CONSISTENT SOLUTION
DISCUSSION