Abstract

An important problem in computational neuroscience is to understand how networks of spiking neurons can carry out various computations underlying behavior. Balanced spiking networks (BSNs) provide a powerful framework for implementing arbitrary linear dynamical systems in networks of integrate-and-fire neurons. However, the classic BSN model requires near-instantaneous transmission of spikes between neurons, which is biologically implausible. Introducing realistic synaptic delays leads to a pathological regime known as “ping-ponging”, in which different populations spike maximally in alternating time bins, causing network output to overshoot the target solution. Here we document this phenomenon and provide a novel solution: we show that a network can have realistic synaptic delays while maintaining accuracy and stability if neurons are endowed with conditionally Poisson firing. Formally, we propose two alternate formulations of Poisson balanced spiking networks: (1) a “local” framework, which replaces the hard integrate-and-fire spiking rule within each neuron by a “soft” threshold function, such that firing probability grows as a smooth nonlinear function of membrane potential; and (2) a “population” framework, which reformulates the BSN objective function in terms of expected spike counts over the entire population. We show that both approaches offer improved robustness, allowing for accurate implementation of network dynamics with realistic synaptic delays between neurons. Both Poisson frameworks preserve the coding accuracy and robustness to neuron loss of the original model and, moreover, produce positive correlations between similarly tuned neurons, a feature of real neural populations that is not found in the deterministic BSN. This work unifies balanced spiking networks with Poisson generalized linear models and suggests several promising avenues for future research.
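The “local” framework described above can be illustrated with a minimal sketch of a soft-threshold spiking rule: rather than firing deterministically when the membrane potential V crosses threshold, a neuron fires as a conditionally Poisson process whose rate is a smooth increasing function of V. The sigmoidal nonlinearity and the parameters beta (sharpness) and r_max (rate ceiling) below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def spike_prob(V, Th, dt, beta=5.0, r_max=200.0):
    """Probability of at least one spike in a bin of width dt (seconds).

    The firing rate is a smooth sigmoidal function of the distance of the
    membrane potential V above the threshold Th; beta and r_max are
    illustrative parameters, not values from the paper.
    """
    z = beta * (V - Th)
    rate = r_max * np.exp(z) / (1.0 + np.exp(z))   # Hz, bounded by r_max
    return 1.0 - np.exp(-rate * dt)                # Poisson count >= 1
```

Because firing probability varies smoothly with membrane potential, two neurons with similar potentials no longer fire in rigid lockstep, which is the intuition for why this rule suppresses the ping-ponging regime.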

Highlights

  • The brain carries out a wide variety of computations that can be implemented by dynamical systems, from sensory integration [1,2,3,4], to working memory [5,6,7], to movement planning and execution [8,9,10]

  • An important example of this second approach is the balanced spiking network (BSN) framework introduced by Boerlin et al. [31]

  • The BSN model consists of a network of coupled leaky integrate-and-fire (LIF) neurons that can emulate an arbitrary linear dynamical system (LDS)


Introduction

The brain carries out a wide variety of computations that can be implemented by dynamical systems, from sensory integration [1,2,3,4], to working memory [5,6,7], to movement planning and execution [8,9,10]. While the existence of such computations in the brain is well established, the mechanisms by which they are implemented in networks of neurons remain poorly understood. One approach to this problem involves statistical modeling, which uses descriptive statistical methods to infer the dynamics of neural activity from recorded spike trains [10,11,12,13,14,15,16,17,18,19,20,21]. In the BSN framework, the population is divided into “excitatory” and “inhibitory” populations of neurons, based on whether they contribute positively or negatively to the output. This leads to an intuitive spiking rule: a neuron should spike whenever doing so will reduce the error between the output of the target LDS and the network output, i.e., the weighted combination of filtered spikes emitted so far. A neuron’s membrane potential is thus a local representation of the network-wide error between target output and current network output, and its spike threshold is proportional to the amount by which adding a spike will reduce this error.
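The greedy spiking rule described above can be sketched in a few lines: the network readout is a weighted sum of leaky-filtered spike counts, each neuron's "membrane potential" is the projection of the coding error onto its decoding weights, and a neuron fires only when a spike would reduce the squared error. The network size, decoding weights, target trajectory, and decay rate below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy setup: N neurons tracking a K-dimensional target x(t).
N, K, T, dt = 20, 2, 1000, 1e-3
D = 0.1 * rng.standard_normal((K, N))    # decoding weights, one column per neuron
lam = 10.0                               # decay rate of the filtered spike trains
ts = np.linspace(0, 4 * np.pi, T)
x = np.stack([np.sin(ts), np.cos(ts)])   # target trajectory, shape (K, T)

r = np.zeros(N)                          # leaky-filtered spike counts
xhat = np.zeros((K, T))                  # network readout over time
for t in range(T):
    err = x[:, t] - D @ r                # network-wide coding error
    V = D.T @ err                        # membrane potentials: projected error
    Th = 0.5 * np.sum(D**2, axis=0)      # threshold: half the per-spike error reduction
    i = np.argmax(V - Th)                # greedy rule: at most one spike per bin
    if V[i] > Th[i]:                     # spike only if it reduces the squared error
        r[i] += 1.0
    r *= np.exp(-lam * dt)               # leaky decay of the filtered spikes
    xhat[:, t] = D @ r                   # readout after this time step
```

Updating the whole network synchronously within each bin is what stands in for the model's near-instantaneous spike transmission; the ping-ponging problem arises precisely when this assumption is relaxed by delaying each neuron's view of the others' spikes.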

