Discrete Synaptic Events Induce Global Oscillations in Balanced Neural Networks.

  • Abstract
  • Similar Papers
Abstract
Although neural dynamics is triggered by discrete synaptic events, the neural response is usually obtained within the diffusion approximation, which represents the synaptic inputs as Gaussian noise. We derive a mean-field formalism encompassing synaptic shot noise for sparse balanced neural networks. For sufficiently low excitatory drive (or, equivalently, high inhibitory feedback), global oscillations emerge via continuous or hysteretic transitions; these are correctly predicted by our approach but not by the diffusion approximation. At sufficiently low in-degrees the nature of these global oscillations changes from drift-driven to cluster activation.
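The distinction the abstract draws between discrete synaptic events and the diffusion approximation can be illustrated with a toy single-neuron model (this is a hypothetical sketch for intuition, not the paper's mean-field formalism): a leaky membrane driven either by Poisson synaptic pulses (shot noise) or by Gaussian white noise with the same mean and variance. All parameter values below are illustrative choices.

```python
import numpy as np

def membrane_stats(mode, rate=1000.0, w=0.02, tau=0.02,
                   dt=1e-4, t_max=20.0, seed=1):
    """Mean and variance of a leaky membrane potential driven either by
    discrete Poisson synaptic pulses ('shot') or by the moment-matched
    Gaussian white noise of the diffusion approximation ('diffusion')."""
    rng = np.random.default_rng(seed)
    n = int(t_max / dt)
    if mode == "shot":
        # each synaptic event instantaneously kicks the potential by w
        inputs = w * rng.poisson(rate * dt, size=n)
    else:
        # Gaussian increments with the same mean (rate*w*dt) and
        # variance (rate*w**2*dt) as the Poisson pulse train
        inputs = rate * w * dt + w * np.sqrt(rate * dt) * rng.standard_normal(n)
    decay = 1.0 - dt / tau            # Euler step of the leak term -v/tau
    v = np.empty(n)
    x = 0.0
    for i in range(n):
        x = x * decay + inputs[i]
        v[i] = x
    return v.mean(), v.var()
```

For these parameters both descriptions give the same stationary mean, tau·rate·w = 0.4, and variance, rate·w²·tau/2 = 0.004; they differ only in higher moments of the input, which is exactly where the shot-noise treatment departs from the diffusion approximation.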

Similar Papers
  • Research Article
  • Cited by: 1
  • 10.1103/47h5-fbyy
Synaptic shot noise triggers fast and slow global oscillations in balanced neural networks.
  • Sep 2, 2025
  • Physical review. E
  • Denis S Goldobin + 4 more

Neural dynamics is determined by the transmission of discrete synaptic pulses (synaptic shot noise) among neurons. However, the neural responses are usually obtained within the diffusion approximation modeling synaptic inputs as continuous Gaussian noise. Here we present a rigorous mean-field theory that encompasses synaptic shot noise for sparse balanced inhibitory neural networks driven by an excitatory drive. Our theory predicts alternative dynamical regimes, in agreement with numerical simulations, which are not captured by the classical diffusion approximation. Notably, these regimes feature self-sustained global oscillations emerging at low connectivity (in-degree) via either continuous or hysteretic transitions and characterized by irregular neural activity, as expected for balanced dynamics. For sufficiently weak (strong) excitatory drive (inhibitory feedback) the transition line displays a peculiar reentrant shape revealing the existence of global oscillations at low and high in-degrees, separated by an asynchronous regime at intermediate levels of connectivity. The mechanisms leading to the emergence of these global oscillations are distinct: drift-driven at high connectivity and cluster activation at low connectivity. The frequency of these two kinds of global oscillations can be varied from slow (∼1 Hz) to fast (∼100 Hz) without altering their microscopic and macroscopic features by adjusting the excitatory drive and the synaptic inhibition strength in a prescribed way. Furthermore, the cluster-activated oscillations at low in-degrees could correspond to the γ rhythms reported in mammalian cortex and hippocampus and attributed to ensembles of inhibitory neurons sharing few synaptic connections [Buzsáki and Wang, Annu. Rev. Neurosci. 35, 203 (2012), 10.1146/annurev-neuro-062111-150444].

  • Research Article
  • Cited by: 899
  • 10.1162/089976699300016179
Fast global oscillations in networks of integrate-and-fire neurons with low firing rates.
  • Oct 1, 1999
  • Neural Computation
  • Nicolas Brunel + 1 more

We study analytically the dynamics of a network of sparsely connected inhibitory integrate-and-fire neurons in a regime where individual neurons emit spikes irregularly and at a low rate. In the limit when the number of neurons N → ∞, the network exhibits a sharp transition between a stationary and an oscillatory global activity regime where neurons are weakly synchronized. The activity becomes oscillatory when the inhibitory feedback is strong enough. The period of the global oscillation is found to be mainly controlled by synaptic times but depends also on the characteristics of the external input. In large but finite networks, the analysis shows that global oscillations of finite coherence time generically exist both above and below the critical inhibition threshold. Their characteristics are determined as functions of systems parameters in these two different regions. The results are found to be in good agreement with numerical simulations.
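The setup described above — sparsely connected inhibitory integrate-and-fire neurons with delayed synapses and an external drive — can be sketched in a direct simulation. This is a minimal toy version for intuition, not Brunel and Hakim's analytical treatment, and every parameter value is an illustrative guess rather than a value from the paper.

```python
import numpy as np

def inhibitory_network_rate(n=200, c=50, j=0.1, mu=1.5, tau=0.02,
                            delay=0.002, v_th=1.0, v_reset=0.0,
                            dt=1e-4, t_max=1.0, seed=2):
    """Mean firing rate (Hz) of a sparse network of inhibitory LIF
    neurons with delayed synapses and constant suprathreshold drive."""
    rng = np.random.default_rng(seed)
    conn = np.zeros((n, n))
    for post in range(n):
        # c random presynaptic partners per neuron (sparse connectivity)
        conn[post, rng.choice(n, size=c, replace=False)] = 1.0
    steps, d_steps = int(t_max / dt), int(delay / dt)
    buf = np.zeros((d_steps, n))       # ring buffer of delayed spikes
    v = rng.uniform(v_reset, v_th, n)  # randomized initial potentials
    total = 0
    for t in range(steps):
        arriving = buf[t % d_steps]    # spikes emitted d_steps ago
        v += dt / tau * (mu - v) - j * (conn @ arriving)
        spiking = v >= v_th
        v[spiking] = v_reset
        buf[t % d_steps] = spiking     # delivered again at t + d_steps
        total += int(spiking.sum())
    return total / (n * t_max)
```

Because the external drive mu exceeds threshold, an isolated neuron would fire at roughly 45 Hz; the recurrent inhibition pulls the network rate well below that, as in the low-rate regime the abstract describes.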

  • Research Article
  • Cited by: 239
  • 10.1016/j.neuron.2008.05.008
High-Frequency Organization and Synchrony of Activity in the Purkinje Cell Layer of the Cerebellum
  • Jun 1, 2008
  • Neuron
  • Camille De Solages + 8 more


  • Research Article
  • Cited by: 1
  • 10.1016/j.proeng.2011.08.582
Feedback-dependence and robustness of gamma oscillations in networks with excitatory and inhibitory neurons
  • Jan 1, 2011
  • Procedia Engineering
  • Jinli Xie + 2 more


  • Research Article
  • Cited by: 63
  • 10.7554/elife.13824
Neural oscillations as a signature of efficient coding in the presence of synaptic delays.
  • Jul 7, 2016
  • eLife
  • Matthew Chalk + 2 more

Cortical networks exhibit 'global oscillations', in which neural spike times are entrained to an underlying oscillatory rhythm, but where individual neurons fire irregularly, on only a fraction of cycles. While the network dynamics underlying global oscillations have been well characterised, their function is debated. Here, we show that such global oscillations are a direct consequence of optimal efficient coding in spiking networks with synaptic delays and noise. To avoid firing unnecessary spikes, neurons need to share information about the network state. Ideally, membrane potentials should be strongly correlated and reflect a 'prediction error' while the spikes themselves are uncorrelated and occur rarely. We show that the most efficient representation is when: (i) spike times are entrained to a global Gamma rhythm (implying a consistent representation of the error); but (ii) few neurons fire on each cycle (implying high efficiency), while (iii) excitation and inhibition are tightly balanced. This suggests that cortical networks exhibiting such dynamics are tuned to achieve a maximally efficient population code.


  • Abstract
  • 10.1186/1471-2202-14-s1-p395
The role of environmental feedback in a brain state switch from passive to active sensing
  • Jul 1, 2013
  • BMC Neuroscience
  • Christopher L Buckley + 1 more

Coherent behaviour emerges from mutual interaction between the brain, body, and environment across multiple timescales, and not from within the brain alone [1,2]. For example, sensation is actively shaped by dynamical interaction of the brain and environment through motor actions such as sniffing, saccading, and touching. The onset of active sensing is often concomitant with qualitative changes in neural dynamics [3] and in responses to sensory input [4,5]. Indeed, some neural responses are uniquely sensitive to the presence or absence of dynamical sensory feedback [6]. However, understanding how active sensing strategies impact neural dynamics and sensory responses is an open challenge. Whisking behaviour in rodents has been the central model system for studying neural mechanisms of active sensing. During quiet wakefulness, the membrane potentials of neurons in the barrel cortex exhibit high-power, low-frequency fluctuations, and nearby neurons become highly correlated [3]. As active whisking onsets, the brain state of the barrel cortex qualitatively changes: low-frequency fluctuations are suppressed and nearby neurons decorrelate [3]. Interestingly, both correlation and low-frequency LFP are partially restored during periods of active touch, i.e., when the mouse palpates toward, and repeatedly contacts, an object [3]. The brain-state transition is also concomitant with changes in the responses of barrel cortex neurons to whisker stimulation: the sensitivity of neurons to whisker perturbation drops with the onset of whisking [5]. However, robust and repeatable whisker responses are recovered for more naturalistic stimuli, i.e., active touch [5]. Here we propose a theory, and construct a model, of active whisking that explains the core phenomenology through the dynamical interaction between the brain and the environment. The three core assumptions underlying the theory are: (1) strong low-frequency membrane fluctuations, correlations between nearby neurons, and sensory responses to brief whisker deflection associated with the quiet attentive state arise as a result of network dynamics that are close to a dynamical instability; (2) reafferent input (sensory feedback related to self-action) during whisking behaviour provides negative feedback to sensory neurons that stabilizes cortical dynamics, reducing low-frequency fluctuations, correlations between neurons, and sensory responses to brief whisker deflection, while increasing the correlation between cortical activity and whisker position; (3) interrupting the reafferent signal, via a whisker-touch event, temporarily destabilises the cortex and enhances ex-afferent input (external sensory input coding whisker contacts). Our theory and model suggest that sensory feedback via reafference signals is sufficient to account for the changes in cortical dynamics that appear with the onset of active whisking, and they provide a novel mechanistic account of sensory processing during active whisking. We discuss how this feedback-stabilization mechanism coexists and interacts with other internal mechanisms that modulate cortical dynamics within the brain [3,7]. We quantify how this mechanism of active touch enhances the signal-to-noise ratio of the input to the cortex. Finally, we discuss the wider implications of these results for experimental work attempting to characterise neural activity and responses in an open-loop (absence of sensory/motor feedback) condition.

  • Research Article
  • 10.1121/1.2023954
Comparison of simulated peripheral auditory responses to vowels in noise with neural data
  • Dec 1, 1986
  • The Journal of the Acoustical Society of America
  • Karen L Payton

A model of the auditory periphery, designed to fit neural responses to tones, has been shown [K. L. Payton, J. Acoust. Soc. Am. Suppl. 1 77, S80 (1985)] to predict many response characteristics of auditory-nerve fibers to vowel sounds in quiet [M. B. Sachs and E. D. Young, J. Acoust. Soc. Am. 68, 858–875 (1980)]. This paper examines the effects of background noise on model vowel responses and compares them to neural responses [M. B. Sachs, H. F. Voigt, and E. D. Young, J. Neurophysiol. 50, 27–45 (1983)]. In quiet, both the neural and model average rate responses saturate at high stimulus levels. Representation of vowel formants by the ALSR, a temporal measure of neural responses relatively insensitive to level, is also predicted by the model. When white noise is added, at + 10 dB S/N ratio, both the neural and model average rate responses saturate at lower vowel sound levels. However, the presence of background noise degrades the model ALSR responses more than it degrades the neural ALSR responses. [Work supported by NSF.]

  • Research Article
  • Cited by: 153
  • 10.1016/j.neuron.2008.12.003
Similarity Effect and Optimal Control of Multiple-Choice Decision Making
  • Dec 1, 2008
  • Neuron
  • Moran Furman + 1 more


  • Research Article
  • Cited by: 272
  • 10.1371/journal.pcbi.1003258
Predictive Coding of Dynamical Variables in Balanced Spiking Networks
  • Nov 14, 2013
  • PLoS Computational Biology
  • Martin Boerlin + 2 more

Two observations about the cortex have puzzled neuroscientists for a long time. First, neural responses are highly variable. Second, the level of excitation and inhibition received by each neuron is tightly balanced at all times. Here, we demonstrate that both properties are necessary consequences of neural networks that represent information efficiently in their spikes. We illustrate this insight with spiking networks that represent dynamical variables. Our approach is based on two assumptions: We assume that information about dynamical variables can be read out linearly from neural spike trains, and we assume that neurons only fire a spike if that improves the representation of the dynamical variables. Based on these assumptions, we derive a network of leaky integrate-and-fire neurons that is able to implement arbitrary linear dynamical systems. We show that the membrane voltage of the neurons is equivalent to a prediction error about a common population-level signal. Among other things, our approach allows us to construct an integrator network of spiking neurons that is robust against many perturbations. Most importantly, neural variability in our networks cannot be equated to noise. Despite exhibiting the same single unit properties as widely used population code models (e.g. tuning curves, Poisson distributed spike trains), balanced networks are orders of magnitudes more reliable. Our approach suggests that spikes do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly underestimated.

  • Research Article
  • Cited by: 37
  • 10.1371/journal.pone.0036670
Simple, Fast and Accurate Implementation of the Diffusion Approximation Algorithm for Stochastic Ion Channels with Multiple States
  • May 22, 2012
  • PLoS ONE
  • Patricio Orio + 1 more

Background: The phenomena that emerge from the interaction of the stochastic opening and closing of ion channels (channel noise) with nonlinear neural dynamics are essential to our understanding of the operation of the nervous system. The effects that channel noise can have on neural dynamics are generally studied using numerical simulations of stochastic models. Algorithms based on discrete Markov chains (MC) seem to be the most reliable and trustworthy, but even optimized algorithms come with a non-negligible computational cost. Diffusion approximation (DA) methods use stochastic differential equations (SDEs) to approximate the behavior of a number of MCs, considerably speeding up simulation times. However, model comparisons have suggested that DA methods did not lead to the same results as MC modeling in terms of channel-noise statistics and effects on excitability. Recently, it was shown that the difference arose because MCs were modeled with coupled gating particles, while the DA was modeled using uncoupled gating particles. Implementations of DA with coupled particles, in the context of a specific kinetic scheme, yielded similar results to MC. However, it remained unclear how to generalize these implementations to different kinetic schemes, or whether they were faster than MC algorithms. Additionally, a steady-state approximation was used for the stochastic terms, which, as we show here, can introduce significant inaccuracies.
Main Contributions: We derived the SDE explicitly for any given ion channel kinetic scheme. The resulting generic equations were surprisingly simple and interpretable, allowing an easy, transparent, and efficient DA implementation that avoids unnecessary approximations. The algorithm was tested in a voltage-clamp simulation and in two different current-clamp simulations, yielding the same results as MC modeling. Moreover, the simulation efficiency of this DA method was considerably superior to that of MC methods, except when short time steps or low channel numbers were used.
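The MC-versus-DA comparison discussed above can be sketched for the simplest possible case: a population of independent two-state channels. This is a hypothetical illustration of the general idea — discrete Markov-chain updates versus a moment-matched SDE — not the coupled-particle scheme the paper derives, and all rate constants are invented for the example.

```python
import numpy as np

def open_fraction(method, n_ch=1000, alpha=2.0, beta=8.0,
                  dt=1e-2, t_max=50.0, seed=3):
    """Time-averaged open fraction of n_ch two-state ion channels,
    simulated as discrete Markov chains ('mc') or with the diffusion
    approximation ('da'). alpha/beta are opening/closing rates (1/s)."""
    rng = np.random.default_rng(seed)
    steps = int(t_max / dt)
    frac = np.empty(steps)
    if method == "mc":
        is_open = np.zeros(n_ch, dtype=bool)
        for i in range(steps):
            u = rng.random(n_ch)
            opens = ~is_open & (u < alpha * dt)   # closed -> open
            closes = is_open & (u < beta * dt)    # open -> closed
            is_open = (is_open | opens) & ~closes
            frac[i] = is_open.mean()
    else:
        n_open = n_ch * alpha / (alpha + beta)    # start at the mean
        for i in range(steps):
            n_closed = n_ch - n_open
            drift = (alpha * n_closed - beta * n_open) * dt
            var = (alpha * n_closed + beta * n_open) * dt
            n_open += drift + np.sqrt(max(var, 0.0)) * rng.standard_normal()
            n_open = min(max(n_open, 0.0), float(n_ch))
            frac[i] = n_open / n_ch
    return frac.mean()
```

Both methods recover the analytical stationary open fraction alpha/(alpha + beta) = 0.2; the discrepancies the paper addresses appear only in more complex, coupled kinetic schemes and in the fluctuation statistics.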

  • Research Article
  • Cited by: 21
  • 10.1016/j.neuroimage.2020.117448
Response certainty during bimanual movements reduces gamma oscillations in primary motor cortex
  • Oct 12, 2020
  • NeuroImage
  • Alex I Wiesman + 4 more

Even when movement outputs are identical, the neural responses supporting them might differ substantially in order to adapt to changing environmental contexts. Despite the essential nature of this adaptive capacity of the human motor system, little is known regarding the effects of contextual response (un)certainty on the neural dynamics known to serve motor processing. In this study, we use a novel bimanual motor task and neuroimaging with magnetoencephalography (MEG) to examine the effects of contextual response certainty on the dynamic neural responses that are important for proper movement. Significant neural responses were identified in the time-frequency domain at the sensor-level and imaged to the cortex using a spectrally resolved beamformer. Combined frequentist and Bayesian statistical testing between neural motor responses under certain and uncertain conditions indicated evidence for no conditional effect on the peri-movement beta desynchronization (18 – 28 Hz; -100 to 300 ms). In contrast, the movement-related gamma synchronization (MRGS; 66 – 86 Hz; -50 to 150 ms) exhibited a robust effect of motor certainty, such that increased contextual response certainty reduced the amplitude of this response. Interestingly, the peak frequency of the MRGS was unaffected by response certainty. These findings both advance our understanding of the neural processes required to adapt our movements under altered environmental contexts, and support the growing conceptualization of the MRGS as being reflective of ongoing higher cognitive processes during movement execution.


  • Research Article
  • Cited by: 27
  • 10.1371/journal.pbio.3001713
Neural dynamics differentially encode phrases and sentences during spoken language comprehension.
  • Jul 14, 2022
  • PLoS biology
  • Fan Bai + 2 more

Human language stands out in the natural world as a biological signal that uses a structured system to combine the meanings of small linguistic units (e.g., words) into larger constituents (e.g., phrases and sentences). However, the physical dynamics of speech (or sign) do not stand in a one-to-one relationship with the meanings listeners perceive. Instead, listeners infer meaning based on their knowledge of the language. The neural readouts of the perceptual and cognitive processes underlying these inferences are still poorly understood. In the present study, we used scalp electroencephalography (EEG) to compare the neural response to phrases (e.g., the red vase) and sentences (e.g., the vase is red), which were close in semantic meaning and had been synthesized to be physically indistinguishable. Differences in structure were well captured in the reorganization of neural phase responses in delta (approximately <2 Hz) and theta bands (approximately 2 to 7 Hz), and in power and power connectivity changes in the alpha band (approximately 7.5 to 13.5 Hz). Consistent with predictions from a computational model, sentences showed more power, more power connectivity, and more phase synchronization than phrases did. Theta-gamma phase-amplitude coupling occurred, but did not differ between the syntactic structures. Spectral-temporal response function (STRF) modeling revealed different encoding states for phrases and sentences, over and above the acoustically driven neural response. Our findings provide a comprehensive description of how the brain encodes and separates linguistic structures in the dynamics of neural responses. They imply that phase synchronization and strength of connectivity are readouts for the constituent structure of language. 
The results provide a novel basis for future neurophysiological research on linguistic structure representation in the brain, and, together with our simulations, support time-based binding as a mechanism of structure encoding in neural dynamics.

  • Research Article
  • Cited by: 5
  • 10.1088/1751-8113/40/31/002
The Thouless–Anderson–Palmer equation for an analogue neural network with temporally fluctuating white synaptic noise
  • Jul 19, 2007
  • Journal of Physics A: Mathematical and Theoretical
  • Akihisa Ichiki + 1 more

Effects of synaptic noise on the retrieval process of associative memory neural networks are studied from the viewpoint of neurobiological and biophysical understanding of information processing in the brain. We investigate the statistical mechanical properties of stochastic analogue neural networks with temporally fluctuating synaptic noise, which is assumed to be white noise. Such networks in general defy the use of the replica method, since they have no energy concept. The self-consistent signal-to-noise analysis (SCSNA), an alternative to the replica method for deriving a set of order-parameter equations, requires no energy concept and thus remains applicable to networks without energy functions. Applying the SCSNA to stochastic networks requires knowledge of the Thouless–Anderson–Palmer (TAP) equation, which defines the deterministic networks equivalent to the original stochastic ones. The TAP equation has been studied very little in the case without an energy concept, which is of particular interest here, whereas it is closely related to the SCSNA when an energy concept exists. This paper derives the TAP equation for networks with synaptic noise, together with a set of order-parameter equations, by a hybrid use of the cavity method and the SCSNA.
