Abstract

A complex interplay of single-neuron properties and the recurrent network structure shapes the activity of cortical neurons. The single-neuron activity statistics differ in general from the respective population statistics, including spectra and, correspondingly, autocorrelation times. We develop a theory for self-consistent second-order single-neuron statistics in block-structured sparse random networks of spiking neurons. In particular, the theory predicts the neuron-level autocorrelation times, also known as intrinsic timescales, of the neuronal activity. The theory is based on an extension of dynamic mean-field theory from rate networks to spiking networks, which is validated via simulations. It accounts for both static variability, e.g., due to a distributed number of incoming synapses per neuron, and temporal fluctuations of the input. We apply the theory to balanced random networks of generalized linear model neurons, balanced random networks of leaky integrate-and-fire neurons, and a biologically constrained network of leaky integrate-and-fire neurons. For the generalized linear model network with an error function nonlinearity, a novel analytical solution of the colored noise problem allows us to obtain self-consistent firing rate distributions, single-neuron power spectra, and intrinsic timescales. For the leaky integrate-and-fire networks, we derive an approximate analytical solution of the colored noise problem, based on the Stratonovich approximation of the Wiener-Rice series and a novel analytical solution for the free upcrossing statistics. Closing the system self-consistently once more, this approximation yields, in the fluctuation-driven regime, reliable estimates of the mean firing rate and its variance across neurons, the interspike-interval distribution, the single-neuron power spectra, and the intrinsic timescales.

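The central quantity above, the intrinsic timescale, can be made concrete with a minimal Python sketch (illustrative, not the paper's estimator): it measures the decay time of a binned spike train's autocorrelation function by fitting an exponential. The bin width, the exclusion of the zero-lag peak, and the exponential form are all assumptions of this sketch.

import numpy as np
from scipy.optimize import curve_fit

def intrinsic_timescale(counts, dt, max_lag):
    """Decay time of the autocorrelation of a binned spike train.
    counts: spikes per bin, dt: bin width (s), max_lag: largest fitted lag (s)."""
    x = counts - counts.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocovariance at lags >= 0
    acf = acf / acf[0]                                   # normalize to 1 at lag 0
    n = int(max_lag / dt)
    lags = np.arange(1, n) * dt                          # skip lag 0 (delta peak of a point process)
    (tau,), _ = curve_fit(lambda t, tau: np.exp(-t / tau), lags, acf[1:n], p0=[20e-3])
    return tau

# A Poisson-like train has a flat autocorrelation beyond lag 0; slow rate
# fluctuations with correlation time tau produce an exponential decay instead.
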
Highlights

  • Neural dynamics in the cerebral cortex of awake behaving animals unfolds over multiple timescales, ranging from milliseconds up to seconds and more [1,2,3,4,5]

  • The recurrent inputs η_i(t) can be approximated by independent Gaussian processes, which leads to a coarse-grained description of the dynamics: since all inputs are statistically equivalent, the neurons become statistically equivalent as well and the system reduces to N independent, identical stochastic equations (see the sketch after this list)

  • We developed a self-consistent theory for the second-order statistics, in particular the intrinsic timescales as defined by autocorrelation decay times, in block-structured random networks of spiking neurons in an asynchronous irregular state

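The second highlight above can be illustrated with a minimal sketch: once the recurrent input is approximated by an independent Gaussian process, a single effective neuron driven by that process represents the whole population. Here an Ornstein-Uhlenbeck process with assumed parameters stands in for the colored Gaussian input η_i(t); in the paper, its mean and autocorrelation are determined self-consistently from the network rather than fixed by hand.

import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-4, 5.0                    # time step and duration (s)
tau_m, tau_eta = 10e-3, 5e-3         # membrane and input correlation times (s)
mu, sigma = 0.8, 0.3                 # assumed mean and std of the Gaussian input
u, eta, spikes = 0.0, 0.0, []

for step in range(int(T / dt)):
    # Ornstein-Uhlenbeck surrogate for the recurrent input eta_i(t)
    eta += -eta / tau_eta * dt + sigma * np.sqrt(2 * dt / tau_eta) * rng.standard_normal()
    # leaky integrate-and-fire dynamics driven by mean input plus Gaussian fluctuations
    u += (-u + mu + eta) / tau_m * dt
    if u >= 1.0:                     # threshold and reset in arbitrary units
        spikes.append(step * dt)
        u = 0.0

print(f"rate of the effective single neuron: {len(spikes) / T:.1f} spikes/s")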

Summary

INTRODUCTION

Neural dynamics in the cerebral cortex of awake behaving animals unfolds over multiple timescales, ranging from milliseconds up to seconds and more [1,2,3,4,5]. Our approximation leads to integrals, the computationally most demanding of which can be solved analytically. We use these results to explore the parameter space of a balanced random network of LIF neurons for long timescales, and apply the theory to a more elaborate model with population-specific connection probabilities that are constrained by biological data [43]. We start this manuscript with the derivation of the DMFT equations from the characteristic functional of the recurrent input. We use our theory to investigate the timescales in the respective network models.
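
The self-consistent closure mentioned here can be illustrated, in drastically simplified form, at the level of first-order statistics: the firing rate determines the mean and variance of the recurrent input, which in turn determine the firing rate, and the fixed point of this map is found by damped iteration. The paper closes the loop for the full second-order statistics (power spectra and autocorrelation times); the parameter values and the error-function transfer function below are assumptions for illustration only.

import numpy as np
from scipy.special import erf

# assumed parameters of a balanced random network: in-degrees, weight, inhibition ratio
K_E, K_I, J, g = 400, 100, 0.2e-3, 5.0        # J in volts
tau_m, theta, nu_ext = 10e-3, 20e-3, 12e3     # membrane time (s), threshold (V), total external rate (1/s)

def transfer(mu, sigma, nu_max=100.0):
    """Assumed erf-shaped rate transfer function of the input mean and std."""
    return nu_max * 0.5 * (1.0 + erf((mu - theta) / (np.sqrt(2.0) * sigma)))

nu = 10.0                                                   # initial guess (spikes/s)
for _ in range(200):
    mu = tau_m * J * (K_E * nu + nu_ext - g * K_I * nu)     # mean recurrent + external input
    sigma = np.sqrt(tau_m * J**2 * (K_E * nu + nu_ext + g**2 * K_I * nu))  # input std
    nu = 0.5 * nu + 0.5 * transfer(mu, sigma)               # damped fixed-point update

print(f"self-consistent firing rate: {nu:.1f} spikes/s")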

MICROSCOPIC THEORY OF INTRINSIC TIMESCALES
Input statistics
Gaussian process approximation
Self-consistency problem
Static contribution
Multiple populations
External input
Output statistics
Timescale
Spike train power spectrum
Comparison with simulations
GENERALIZED LINEAR MODEL NEURONS
Neuron dynamics
Colored noise problem
Exponential nonlinearity
Error function nonlinearity
Numerical solution of the self-consistency problem
Balanced random network
Exponential nonlinearity: absence of long timescales
Error function nonlinearity: existence of long timescales
Error function nonlinearity: mechanism of timescale
Error function nonlinearity: external timescale
LEAKY INTEGRATE-AND-FIRE NEURONS
Effective stochastic dynamics
Wiener–Rice series and Stratonovich approximation
Timescales in balanced random networks of LIF neurons
Simulation of balanced random network of LIF neurons
Biologically constrained network model
DISCUSSION
Stochastic processes
Point processes
Gaussian integrals
Nonstationary mean and variance of U and I
Initial velocity distribution
One-point upcrossing probability
Stationary correlation function of U and U̇
Stationary two-point upcrossing probability
Findings
Numerics
