Articles published on Synaptic noise
405 Search results
- Research Article
- 10.1371/journal.pcbi.1012727
- Nov 6, 2025
- PLOS Computational Biology
- Akke Mats Houben + 2 more
An inherent challenge in designing laboratory-grown, engineered living neuronal networks lies in predicting the dynamic repertoire of the resulting network and its sensitivity to experimental variables. To fill this gap, and inspired by recent experimental studies, we present a numerical model designed to replicate the anisotropies in connectivity introduced through engineering, characterize the emergent collective behavior of the neuronal network, and make predictions. The numerical model is developed to replicate experimental data, and subsequently used to quantify network dynamics in relation to tunable structural and dynamical parameters. These include the strength of imprinted anisotropies, synaptic noise, and average axon lengths. We show that the model successfully captures the behavior of engineered neuronal cultures, revealing a rich repertoire of activity patterns that are highly sensitive to connectivity architecture and noise levels. Specifically, the imprinted anisotropies promote modularity and high clustering coefficients, substantially reducing the pathological-like bursting of standard neuronal cultures, whereas noise and axonal length influence the variability in dynamical states and activity propagation velocities. Moreover, connectivity anisotropies significantly enhance the ability to reconstruct structural connectivity from activity data, an aspect that is important for understanding the structure-function relationship in neuronal networks. Our work provides a robust in silico framework to assist experimentalists in the design of in vitro neuronal systems and in anticipating their outcomes. This predictive capability is particularly valuable in developing reliable brain-on-a-chip platforms and in exploring fundamental aspects of neural computation, including input-output relationships and information coding.
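The modularity and clustering effects described in this abstract can be illustrated with a toy graph comparison. The sketch below is illustrative only, not the authors' model: it contrasts the average local clustering coefficient of a modular network (dense blocks with sparse cross-links, loosely analogous to imprinted anisotropies) against an Erdős–Rényi graph of matched density. All sizes and probabilities are arbitrary choices.

```python
import random

random.seed(5)

def clustering(adj):
    """Average local clustering coefficient of an undirected graph."""
    n = len(adj)
    total = 0.0
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the neighbors of node i
        links = sum(adj[a][b] for idx, a in enumerate(nbrs) for b in nbrs[idx + 1:])
        total += 2.0 * links / (k * (k - 1))
    return total / n

def random_graph(n, p):
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                adj[i][j] = adj[j][i] = 1
    return adj

def modular_graph(modules, size, p_in, p_out):
    n = modules * size
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if i // size == j // size else p_out
            if random.random() < p:
                adj[i][j] = adj[j][i] = 1
    return adj

mod = modular_graph(3, 30, 0.6, 0.02)          # three dense modules, sparse cross-links
edges = sum(sum(row) for row in mod) // 2
er = random_graph(90, edges / (90 * 89 / 2))   # random graph of matched overall density

c_mod, c_er = clustering(mod), clustering(er)
print(round(c_mod, 2), round(c_er, 2))
```

The modular graph's clustering coefficient sits near the within-module density, well above the Erdős–Rényi baseline, which is roughly the global edge probability.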
- Research Article
- 10.3390/e27111125
- Oct 31, 2025
- Entropy (Basel, Switzerland)
- David Dominguez-Carreta + 4 more
In this paper, we explore the storage capacity and maximal information content of a random recurrent neural network characterized by a very low connectivity. A specific set of patterns is embedded into the network according to the Hebb prescription, a fundamental principle in neural learning. We thoroughly examine how various properties of the network, such as its connectivity and the level of synaptic noise, influence its performance and information retention capabilities, which are evaluated through an entropy measure. Our theoretical analyses are complemented by extensive simulations, and the results are validated through comparisons with the retrieval of real biometric patterns, including retinal vessel maps and fingerprints. This comprehensive approach provides deeper insights into the functionality and limitations of finite-connectivity neural networks and their applicability to the retrieval of complex, structured patterns.
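The Hebb prescription at finite connectivity can be sketched in a few lines. The following is a minimal toy, with illustrative sizes and dilution, not the paper's model or parameters: a handful of patterns is stored in a randomly diluted Hopfield network and retrieval from a corrupted cue is checked via the pattern overlap.

```python
import random

random.seed(1)

N, P = 120, 5          # neurons, stored patterns (load well below capacity)
c = 0.5                # dilution: fraction of synapses kept

# Store patterns with the Hebb prescription on a randomly diluted graph.
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)]
mask = [[1 if (i != j and random.random() < c) else 0 for j in range(N)] for i in range(N)]
W = [[mask[i][j] * sum(p[i] * p[j] for p in patterns) / N for j in range(N)] for i in range(N)]

def recall(cue, sweeps=5):
    """Asynchronous zero-temperature dynamics from a cue state."""
    s = cue[:]
    for _ in range(sweeps):
        for i in random.sample(range(N), N):
            h = sum(W[i][j] * s[j] for j in range(N))
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

# Corrupt 15% of one stored pattern and try to retrieve it.
cue = patterns[0][:]
for i in random.sample(range(N), N * 15 // 100):
    cue[i] = -cue[i]
out = recall(cue)
overlap = sum(a * b for a, b in zip(out, patterns[0])) / N
print(round(overlap, 2))
```

At this low load the dynamics relaxes back to the stored pattern, so the final overlap is close to 1; raising P or lowering c degrades retrieval, which is the regime the paper quantifies with an entropy measure.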
- Research Article
- 10.1073/pnas.2422602122
- Oct 31, 2025
- Proceedings of the National Academy of Sciences
- Georgios Iatropoulos + 2 more
Memory consolidation refers to a process of engram reorganization and stabilization that is thought to occur primarily during sleep through a combination of neural replay, homeostatic plasticity, synaptic maturation, and pruning. From a computational perspective, however, this process remains puzzling, as it is unclear how to incorporate the underlying mechanisms into a common mathematical model of learning and memory. Here, we propose a solution by deriving a self-supervised consolidation model that uses replay and two-factor synapses to encode memories in neural networks in a way that maximizes the robustness of cued recall with respect to intrinsic synaptic noise. We show that the dynamics of this optimization make the connectivity sparse and offer a unified account of several experimentally observed signs of consolidation, such as multiplicative homeostatic scaling, task-driven synaptic pruning, increased neural stimulus selectivity, and preferential strengthening of weak memories. The model also reproduces developmental trends in connectivity and stimulus selectivity better than previous models. Finally, it predicts that intrinsic synaptic noise fluctuations should scale sublinearly with synaptic strength; we find support for this in a meta-analysis of published synaptic imaging datasets.
- Research Article
- 10.1038/s41598-025-09114-8
- Sep 30, 2025
- Scientific reports
- Md Azizul Hakim + 1 more
Neural networks face persistent challenges in maintaining stability and robustness during training, particularly in noisy or high-dimensional domains like molecular analysis. Inspired by biological neural systems that leverage homeostasis and self-repair to sustain functionality, this paper proposes BioLogicalNeuron, a novel neural network layer that integrates calcium-driven homeostatic regulation, self-repair mechanisms, and dynamic stability monitoring. The layer mimics biological calcium dynamics to maintain neuronal activity within optimal ranges, proactively triggers targeted synaptic repair and adaptive noise injection to counteract degradation, and modulates learning rates via real-time health metrics. Extensive experiments across multiple molecular and chemical datasets show that BioLogicalNeuron achieves state-of-the-art performance. The layer's performance is particularly strong on molecular datasets, where its biological mechanisms naturally align with molecular structure learning. Through detailed analysis of calcium dynamics and health-stability relationships, this work demonstrates that BioLogicalNeuron achieves a biologically plausible balance between stability and plasticity, offering insights into both artificial and biological neural networks. These results suggest that incorporating biological mechanisms into neural architectures can lead to more robust and effective learning systems, particularly for molecular and chemical analysis tasks.
- Research Article
- 10.1103/47h5-fbyy
- Sep 1, 2025
- Physical review. E
- Denis S Goldobin + 4 more
Neural dynamics is determined by the transmission of discrete synaptic pulses (synaptic shot noise) among neurons. However, the neural responses are usually obtained within the diffusion approximation modeling synaptic inputs as continuous Gaussian noise. Here we present a rigorous mean-field theory that encompasses synaptic shot noise for sparse balanced inhibitory neural networks driven by an excitatory drive. Our theory predicts alternative dynamical regimes, in agreement with numerical simulations, which are not captured by the classical diffusion approximation. Notably, these regimes feature self-sustained global oscillations emerging at low connectivity (in-degree) via either continuous or hysteretic transitions and characterized by irregular neural activity, as expected for balanced dynamics. For sufficiently weak (strong) excitatory drive (inhibitory feedback) the transition line displays a peculiar reentrant shape revealing the existence of global oscillations at low and high in-degrees, separated by an asynchronous regime at intermediate levels of connectivity. The mechanisms leading to the emergence of these global oscillations are distinct: drift-driven at high connectivity and cluster activation at low connectivity. The frequency of these two kinds of global oscillations can be varied from slow (∼1 Hz) to fast (∼100 Hz) without altering their microscopic and macroscopic features by adjusting the excitatory drive and the synaptic inhibition strength in a prescribed way. Furthermore, the cluster-activated oscillations at low in-degrees could correspond to the γ rhythms reported in mammalian cortex and hippocampus and attributed to ensembles of inhibitory neurons sharing few synaptic connections [Buzsáki and Wang, Annu. Rev. Neurosci. 35, 203 (2012)].
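The contrast between discrete synaptic pulses and their Gaussian surrogate can be sketched numerically. This is an illustrative toy, not the paper's network model: a leaky integrator is driven either by Poisson pulses or by white noise matched in mean and variance. Both traces have the same first two moments, but the shot-noise trace retains the positive skew that the diffusion approximation discards, which is one reason discreteness can matter.

```python
import random, math

random.seed(7)

tau, dt, T = 20.0, 0.1, 5000.0   # membrane time constant, step, duration (ms)
J, rate = 1.5, 0.4               # pulse amplitude (mV) and Poisson rate (1/ms)
steps = int(T / dt)

def shot_trace():
    """Leaky integrator driven by discrete Poisson pulses (synaptic shot noise)."""
    v, xs = 0.0, []
    for _ in range(steps):
        v -= v / tau * dt
        if random.random() < rate * dt:
            v += J
        xs.append(v)
    return xs

def diffusion_trace():
    """Same mean and variance, with the pulses replaced by Gaussian white noise."""
    v, xs = 0.0, []
    mu, sig = J * rate, J * math.sqrt(rate)
    for _ in range(steps):
        v += (-v / tau + mu) * dt + sig * math.sqrt(dt) * random.gauss(0.0, 1.0)
        xs.append(v)
    return xs

def stats(xs):
    xs = xs[len(xs) // 4:]                      # discard the initial transient
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    skew = sum((x - m) ** 3 for x in xs) / len(xs) / var ** 1.5
    return m, var, skew

m_shot, v_shot, s_shot = stats(shot_trace())
m_diff, v_diff, s_diff = stats(diffusion_trace())
# Both sample means should sit near the theoretical value J * rate * tau = 12 mV;
# the third moment is where the two descriptions differ.
print(round(m_shot, 1), round(m_diff, 1), round(s_shot, 2), round(s_diff, 2))
```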
- Research Article
- 10.1103/nwbb-n1bc
- Sep 1, 2025
- Physical review. E
- Sharba Bhattacharjee + 1 more
We study the retrieval accuracy and capacity of modern Hopfield networks with two-state (Ising) spins interacting via modified Hebbian n-spin interactions. In particular, we consider systems where the interactions deviate from the Hebb rule through additive or multiplicative noise or through clipping or deleting interactions. We find that the capacity scales as N^{n-1} with the number of spins N in all cases, but with a prefactor reduced compared to the Hebbian case. For n=2 our results agree with the previously known results for the conventional n=2 Hopfield network.
- Research Article
- 10.3389/fphys.2025.1590949
- Jul 18, 2025
- Frontiers in physiology
- R Borzuola + 6 more
Superimposing neuromuscular electrical stimulation (NMES) onto voluntary contractions induces specific neuro-physiological adaptations that may have a direct effect on force-related outcomes. This study investigated motor unit discharge characteristics and force steadiness following three acute experimental conditions: NMES superimposed onto isometric contractions (NMES + ISO), passive NMES, and isometric contractions only (ISO). Seventeen healthy volunteers participated in the study. Each condition involved 20 intermittent (6 s contraction/6 s rest) isometric ankle dorsiflexions at 20% of maximal voluntary isometric contraction (MVIC). NMES was delivered to the tibialis anterior (TA) during NMES and NMES + ISO. High-density surface electromyography (HDsEMG) was used to record myoelectric activity in the TA during steady force-matching contractions, at 10% MVIC, performed immediately after each experimental condition. Motor unit discharge rate (DR) and inter-spike variability (ISIvar) were analyzed from decomposed HDsEMG signals. Coherence analysis was performed to evaluate the strength of common synaptic input across different frequency bands and the proportion of common synaptic input (pCSI) received by spinal motoneurons. Force steadiness was evaluated using the coefficient of variation of force (ForceCoV). NMES + ISO significantly increased motor unit DR compared to baseline and post-intervention NMES. NMES + ISO also induced an increase in pCSI compared to baseline, ISO and NMES. ForceCoV was reduced after NMES + ISO compared to all experimental conditions, indicating improved force steadiness. These results suggest that superimposing NMES onto voluntary contractions can enhance motor unit firing rate and pCSI at low force levels. These adaptations seem to positively contribute to force steadiness, likely by engaging filtering mechanisms which minimize the independent synaptic noise affecting motor control.
These findings provide new perspectives on the adaptations induced by NMES exercise, highlighting some of the neuro-physiological mechanisms involved and enriching our knowledge of how the neuromuscular system responds and adapts to NMES-based interventions.
- Research Article
- 10.1103/physrevresearch.7.023172
- May 20, 2025
- Physical Review Research
- Gianni Valerio Vinci + 1 more
Local networks of neurons are nonlinear systems driven by synaptic currents elicited by their own spiking activity and by the input received from other brain areas. Synaptic currents are well approximated by correlated Gaussian noise. Moreover, the population dynamics of neuronal networks is often found to be multistable, allowing the noise source to induce state transitions. State changes in neuronal systems underlie the way information is encoded and transformed. The characterization of the escape time from metastable states is therefore a cornerstone for understanding how information is processed in the brain. The effects of correlated inputs forcing bistable systems have been studied for over half a century; nonetheless, most results are perturbative or valid only when a separation of time scales is present. Here, we present a novel and exact result holding when the correlation time of the noise source is identical to that of the neural population, hence solving the mean escape time problem in a very general setting.
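The escape-time problem has a bare-bones numerical analogue. The sketch below is generic and hedged: a particle in a double well driven by Ornstein–Uhlenbeck (colored) noise, with arbitrary parameters, standing in for the bistable population dynamics; it is not the authors' neural model or their exact solution.

```python
import random, math

random.seed(11)

dt, tau_c, sigma = 0.01, 0.5, 1.0     # step, noise correlation time, noise std
rho = math.exp(-dt / tau_c)
amp = sigma * math.sqrt(1.0 - rho * rho)

def escape_time(max_steps=100_000):
    """First time a state started in the left well of U(x) = -x^2/2 + x^4/4
    (minimum at x = -1, barrier at x = 0) is driven past the barrier by
    colored noise. Returns None if no escape within the step budget."""
    x, y = -1.0, 0.0
    for step in range(max_steps):
        y = y * rho + amp * random.gauss(0.0, 1.0)   # exact OU update for the noise
        x += (x - x ** 3 + y) * dt                   # drift -U'(x) plus colored forcing
        if x > 0.5:                                  # safely into the right well
            return step * dt
    return None

times = [escape_time() for _ in range(10)]
mean_escape = sum(times) / len(times)
print(round(mean_escape, 1))   # mean escape time in arbitrary units
```

Averaging such first-passage times over many trials is the quantity whose analytical form the paper derives when the noise and system correlation times coincide.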
- Research Article
- 10.7242/2658-705x/2024.4.3
- Feb 27, 2025
- Perm Scientific Center Journal
- М.В Агеева + 1 more
In 2024, the Nobel Prize in Physics was awarded for work that laid the foundation for the development of machine learning based on artificial neural networks. This event can be considered a public recognition of the role of mathematical models such as the Hopfield model, and of the use of the mathematical apparatus of statistical physics and quantum mechanics to describe collective dynamics in them. Although neuronal network dynamics is mediated by discrete synaptic signals, the theoretical description of the endogenous noise in such networks is usually constructed within the framework of the diffusion approximation. This approach has a significant drawback, since a discrete set of signals is in fact represented as continuous Gaussian noise. As a result, the obtained equations are effectively "blind" to some regimes of collective dynamics of the system: in particular, to the possibility of hysteretic transitions between asynchronous and oscillatory dynamics in a balanced neural network with sparse connectivity. The paper describes a recently introduced full mean-field formalism that takes into account the effective synaptic shot noise in a sparse network of spiking neurons. Two mechanisms of global oscillations in the system, depending on the degree of network sparsity, are also found and explained. The developed formalism was tested on two models of neuron dynamics: quadratic integrate-and-fire neurons and the Morris–Lecar model.
- Research Article
- 10.1152/jn.00444.2024
- Feb 24, 2025
- Journal of neurophysiology
- Alexandre Melanson + 3 more
The stochastic flickering of ion channels is known to cause ongoing membrane potential fluctuations in neurons. This channel noise is often considered negligible when compared with synaptic noise, yet it can shape the integrative properties of neurons. Here, in vitro recordings of electrosensory pyramidal neurons under synaptic blockade are characterized and shown to contain a nontrivial repertoire of dynamical features. Our analyses reveal an intrinsic noise structure that is much richer than what could be expected based on previous studies: we identify rapid, small-amplitude, shot noise-like events and we quantify how their rate and amplitude are modulated by slower, large-amplitude fluctuations. This cross-relation is evidence that, at the single-neuron level, membrane potential dynamics can exhibit a form of phase-amplitude coupling. We also investigate the appearance of fast, intermittent subthreshold oscillations and conclude that they are a manifestation of stochastic linear dynamics, possibly with time-varying parameters. Our results, collectively, highlight that neurons can spontaneously display rich intrinsic behavior, which is likely to impact how they process synaptic input.
NEW & NOTEWORTHY How do neurons behave in the absence of synaptic input? Can their intrinsic activity convey important information about how they function? Here, we provide evidence that the structure of intrinsic voltage noise in pyramidal neurons contains several nontrivial components, contrary to what is usually assumed. We show, for the first time, that a form of phase-amplitude coupling can exist in the spontaneous electrical activity of single neurons.
- Research Article
- 10.1080/02640414.2025.2462356
- Feb 7, 2025
- Journal of Sports Sciences
- Chrysi Tsiouri + 7 more
Our purpose was to compare the influence of motor unit activity in Flexor Digitorum Brevis (FDB) and Soleus (SOL) on force fluctuations during three forward-leaning tasks. Ground reaction forces and high-density EMG signals were collected from 19 males when leaning forward at 25%, 50%, and 75% of maximal forward leaning force. EMG amplitude increased with percent of leaning and was greater for SOL than FDB, but there were no differences in force fluctuations across tasks. Differences in motor unit activity indicated that the relative contribution of the two muscles to the control of balance varied across tasks as confirmed by the association between the fluctuations in neural drive [standard deviation of the filtered cumulative spike train (SD of fCST)] and force [coefficient of variation (CoV) for force]. Specifically, the correlation values were greater for FDB at the lower target forces. Correlation analyses revealed that synaptic noise (CoV for interspike interval) was weakly correlated with the CoV for force, whereas the variability in shared synaptic input (SD of fCST) was strongly correlated with the CoV for force. This finding suggests that the relative influence of the two muscles on the fluctuations in force during forward leaning varied with task requirements.
- Research Article
- 10.1371/journal.pcbi.1012262
- Dec 13, 2024
- PLoS computational biology
- Jorin Overwiening + 3 more
The thalamus is the brain's central relay station, orchestrating sensory processing and cognitive functions. However, how thalamic function depends on internal and external states is not well understood. A comprehensive understanding would necessitate the integration of single-cell dynamics with their collective behavior at the population level. For this we propose a biologically realistic mean-field model of the thalamus, describing thalamocortical relay neurons (TC) and thalamic reticular neurons (RE). We perform a multi-scale study of thalamic responsiveness and its dependence on cell and brain states. Building upon existing single-cell experiments we show that: (1) Awake and sleep-like states can be defined via the absence/presence of the neuromodulator acetylcholine (ACh), which indirectly controls bursting in TC and RE. (2) Thalamic response to sensory stimuli is linear in the awake state and becomes nonlinear in the sleep state, while cortical input generates nonlinear responses in both awake and sleep states. (3) Stimulus response is controlled by cortical input, which suppresses responsiveness in the awake state while it 'wakes up' the thalamus in the sleep state, promoting a linear response. (4) Synaptic noise induces a global linear responsiveness, diminishing the difference in response between thalamic states. Finally, the model replicates spindle oscillations within a sleep-like state, exhibiting a qualitative change in activity and responsiveness. The development of this thalamic mean-field model provides a new tool for incorporating detailed thalamic dynamics in large-scale brain simulations.
- Research Article
- 10.1103/physrevlett.133.238401
- Dec 6, 2024
- Physical review letters
- Denis S Goldobin + 2 more
Despite the fact that neural dynamics is triggered by discrete synaptic events, the neural response is usually obtained within the diffusion approximation representing the synaptic inputs as Gaussian noise. We derive a mean-field formalism encompassing synaptic shot noise for sparse balanced neural networks. For low (high) excitatory drive (inhibitory feedback) global oscillations emerge via continuous or hysteretic transitions, correctly predicted by our approach, but not from the diffusion approximation. At sufficiently low in-degrees the nature of these global oscillations changes from drift driven to cluster activation.
- Research Article
- 10.3389/fneur.2024.1471118
- Nov 18, 2024
- Frontiers in Neurology
- Selina Baeza-Loya + 1 more
Vestibular afferent neurons occur as two populations with differences in spike timing regularity that are independent of rate. The more excitable regular afferents have lower current thresholds and sustained spiking responses to injected currents, while irregular afferent neurons have higher thresholds and transient responses. Differences in expression of low-voltage-activated potassium (KLV) channels are emphasized in models of spiking regularity and excitability in these neurons, leaving open the potential contributions of the voltage-gated sodium (NaV) channels responsible for the spike upstroke. We investigated the impact of different NaV current modes (transient, persistent, and resurgent) with whole-cell patch clamp experiments in mouse vestibular ganglion neurons (VGNs), the cultured and dissociated cell bodies of afferents. All VGNs had transient NaV current, many had a small persistent (non-inactivating) NaV current, and a few had resurgent current, which flows after the spike when NaV channels that were blocked are unblocked. A known NaV1.6 channel blocker decreased spike rate and altered spike waveforms in both sustained and transient VGNs and affected all three modes of NaV current. A NaV channel agonist enhanced persistent current and increased spike rate and regularity. We hypothesized that persistent and resurgent currents have different effects on sustained (regular) VGNs vs. transient (irregular) VGNs. Lacking blockers specific for the different current modes, we used modeling to isolate their effects on spiking of simulated transient and sustained VGNs, driven by simulated current steps and noisy trains of simulated EPSCs. In all simulated neurons, increasing transient NaV current increased spike rate and rate-independent regularity. In simulated sustained VGNs, adding persistent current increased both rate and rate-independent regularity, while adding resurgent current had limited impact. 
In transient VGNs, adding persistent current had little impact, while adding resurgent current increased both rate and rate-independent irregularity by enhancing sensitivity to synaptic noise. These experiments show that the small NaV current modes may enhance the differentiation of afferent populations, with persistent currents selectively making regular afferents more regular and resurgent currents selectively making irregular afferents more irregular.
- Research Article
- 10.1063/5.0225760
- Nov 1, 2024
- Chaos (Woodbury, N.Y.)
- Marius E Yamakou + 2 more
Inverse stochastic resonance (ISR) is a counterintuitive phenomenon where noise reduces the oscillation frequency of an oscillator to a minimum occurring at an intermediate noise intensity, and sometimes even to the complete absence of oscillations. In neuroscience, ISR was first experimentally verified with cerebellar Purkinje neurons [Buchin et al., PLOS Comput. Biol. 12, e1005000 (2016)]. These experiments showed that ISR enables a locally optimal information transfer between the input and output spike train of neurons. Subsequent studies have further demonstrated the efficiency of information processing and transfer in neural networks with small-world network topology. We have conducted a numerical investigation into the impact of adaptivity on ISR in a small-world network of noisy FitzHugh-Nagumo (FHN) neurons, operating in a bi-metastable regime consisting of a metastable fixed point and a metastable limit cycle. Our results show that the degree of ISR is highly dependent on the value of the FHN model's timescale separation parameter ε. The network structure undergoes dynamic adaptation via mechanisms of either spike-time-dependent plasticity (STDP) with potentiation-/depression-domination parameter P or homeostatic structural plasticity (HSP) with rewiring frequency F. We demonstrate that both STDP and HSP amplify the effect of ISR when ε lies within the bi-stability region of FHN neurons. Specifically, at larger values of ε within the bi-stability regime, higher rewiring frequencies F are observed to enhance ISR at intermediate (weak) synaptic noise intensities, while values of P consistent with depression-domination (potentiation-domination) consistently enhance (deteriorate) ISR. Moreover, although STDP and HSP control parameters may jointly enhance ISR, P has a greater impact on improving ISR compared to F. 
Our findings inform future ISR enhancement strategies in noisy artificial neural circuits, aiming to optimize local information transfer between input and output spike trains in neuromorphic systems, and suggest avenues for experiments in neural networks.
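The noisy FitzHugh–Nagumo dynamics underlying this kind of study can be sketched with a plain Euler–Maruyama integration. This uses textbook parameters (a = 0.7, b = 0.8), not the authors' adaptive-network setup, and it illustrates the quiescent, driven, and noise-perturbed regimes rather than the full ISR effect.

```python
import math, random

random.seed(2)

a, b, eps, dt = 0.7, 0.8, 0.08, 0.05

def count_spikes(I, sigma, T=500.0):
    """Euler-Maruyama integration of the FitzHugh-Nagumo model; a 'spike' is a
    full excursion of v from below 0 up past 1."""
    v, w = -1.2, -0.62          # near the resting state for I = 0
    spikes, armed = 0, True
    for _ in range(int(T / dt)):
        dv = (v - v ** 3 / 3.0 - w + I) * dt \
             + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        dw = eps * (v + a - b * w) * dt
        v, w = v + dv, w + dw
        if armed and v > 1.0:
            spikes += 1
            armed = False       # wait for the trajectory to return before re-arming
        elif v < 0.0:
            armed = True
    return spikes

quiet = count_spikes(I=0.0, sigma=0.0)   # excitable rest: no spikes
driven = count_spikes(I=0.5, sigma=0.0)  # suprathreshold drive: tonic spiking
noisy = count_spikes(I=0.0, sigma=0.8)   # noise-induced spiking from rest
print(quiet, driven, noisy)
```

Scanning the noise intensity sigma while sitting in a bistable regime, and averaging spike counts over trials, is how a nonmonotonic (ISR-like) rate curve would be probed.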
- Research Article
- 10.1016/j.apm.2024.115718
- Sep 18, 2024
- Applied Mathematical Modelling
- Lianghui Qu + 4 more
Deterministic analysis of stochastic FHN systems based on Gaussian decoupling
- Research Article
- 10.1101/2023.11.28.569044
- Jul 27, 2024
- bioRxiv
- Selina Baeza-Loya + 1 more
Vestibular afferent neurons occur as two populations with differences in spike timing regularity that are independent of rate. The more excitable regular afferents have lower current thresholds and sustained spiking responses to injected currents, while irregular afferent neurons have higher thresholds and transient responses. Differences in expression of low-voltage-activated potassium (KLV) channels are emphasized in models of spiking regularity and excitability in these neurons, leaving open the potential contributions of the voltage-gated sodium (NaV) channels responsible for the spike upstroke. We investigated the impact of different NaV current modes (transient, persistent, and resurgent) with whole-cell patch clamp experiments in mouse vestibular ganglion neurons (VGNs), the cultured and dissociated cell bodies of afferents. All VGNs had transient NaV current, many had a small persistent (non-inactivating) NaV current, and a few had resurgent current, which flows after the spike peak when NaV channels that were blocked are unblocked. NaV1.6 channels conducted most or all of each NaV current mode, and a NaV1.6-selective blocker decreased spike rate and altered spike waveforms in both sustained and transient VGNs. A NaV channel agonist enhanced persistent current and increased spike rate and regularity. We hypothesized that persistent and resurgent currents have different effects on sustained (regular) VGNs vs. transient (irregular) VGNs. Lacking blockers specific for the different current modes, we used modeling to isolate their effects on spiking of simulated transient and sustained VGNs, driven by simulated current steps and noisy trains of simulated EPSCs. In all simulated neurons, increasing transient NaV current increased spike rate and rate-independent regularity. In simulated sustained VGNs, adding persistent current increased both rate and rate-independent regularity, while adding resurgent current had limited impact.
In transient VGNs, adding persistent current had little impact, while adding resurgent current increased both rate and rate-independent irregularity by enhancing sensitivity to synaptic noise. These experiments show that the small NaV current modes may enhance the differentiation of afferent populations, with persistent currents selectively making regular afferents more regular and resurgent currents selectively making irregular afferents less regular.
- Research Article
- 10.3390/math12081149
- Apr 11, 2024
- Mathematics
- Chitaranjan Mahapatra + 1 more
We developed a mathematical model to simulate the dynamics of background synaptic noise in non-neuronal cells. By employing the stochastic Ornstein–Uhlenbeck process, we represented excitatory synaptic conductance and integrated it into a whole-cell model to generate spontaneous and evoked cellular electrical activities. This single-cell model encompasses numerous biophysically detailed ion channels, depicted by a set of ordinary differential equations in Hodgkin–Huxley and Markov formalisms. Consequently, this approach effectively induced irregular spontaneous depolarizations (SDs) and spontaneous action potentials (sAPs), resembling electrical activity observed in vitro. The input resistance decreased significantly, while the firing rate of spontaneous action potentials increased. Moreover, alterations in the ability to reach the action potential threshold were observed. Background synaptic activity can modify the input/output characteristics of non-neuronal excitatory cells. Hence, suppressing these baseline activities could aid in identifying new pharmaceutical targets for various clinical diseases.
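The core idea, an Ornstein–Uhlenbeck excitatory conductance driving a membrane, can be sketched minimally. All parameter values below are illustrative assumptions, not the paper's fitted values, and the membrane here is passive rather than the paper's full Hodgkin–Huxley/Markov cell model.

```python
import random, math

random.seed(3)

# Passive membrane and noise parameters (illustrative values)
C, g_L, E_L, E_e = 200.0, 10.0, -70.0, 0.0      # pF, nS, mV, mV
g0, sigma_g, tau_g, dt = 12.0, 3.0, 2.7, 0.05   # nS, nS, ms, ms

rho = math.exp(-dt / tau_g)
amp = sigma_g * math.sqrt(1.0 - rho * rho)

g, v = g0, E_L
gs, vs = [], []
for _ in range(200_000):
    # Exact discrete-time update of the Ornstein-Uhlenbeck conductance,
    # clipped at zero since conductances cannot be negative.
    g = max(0.0, g0 + (g - g0) * rho + amp * random.gauss(0.0, 1.0))
    # Passive membrane driven by leak and the fluctuating excitatory conductance.
    v += (g_L * (E_L - v) + g * (E_e - v)) / C * dt
    gs.append(g)
    vs.append(v)

mean_g = sum(gs) / len(gs)
std_g = math.sqrt(sum((x - mean_g) ** 2 for x in gs) / len(gs))
print(round(mean_g, 1), round(std_g, 1), round(sum(vs) / len(vs), 1))
```

The sampled conductance statistics recover the prescribed mean and standard deviation, and the membrane fluctuates between the leak and excitatory reversal potentials, which is the "background synaptic bombardment" regime the model studies.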
- Research Article
- 10.1103/physreve.109.024407
- Feb 15, 2024
- Physical Review E
- Magnus J E Richardson
The steady-state firing rate and firing-rate response of the leaky and exponential integrate-and-fire models receiving synaptic shot noise with excitatory and inhibitory reversal potentials is examined. For the particular case where the underlying synaptic conductances are exponentially distributed, it is shown that the master equation for a population of such model neurons can be reduced from an integrodifferential form to a more tractable set of three differential equations. The system is nevertheless more challenging analytically than for current-based synapses: analytical results are provided where possible, together with an efficient numerical scheme and code for other quantities. The increased tractability of the framework developed supports an ongoing critical comparison between models in which synapses are treated with and without reversal potentials, such as recently in the context of networks with balanced excitatory and inhibitory conductances.
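The setup analyzed here can be sketched directly: an integrate-and-fire membrane receiving exponentially distributed conductance jumps with excitatory and inhibitory reversal potentials. The code below uses arbitrary illustrative parameters and a plain Euler scheme, not the paper's master-equation reduction.

```python
import random

random.seed(9)

# Illustrative parameters (nS, mV, pF, ms) -- not those of the paper
C, g_L, E_L = 200.0, 10.0, -70.0
E_e, E_i = 0.0, -80.0            # excitatory and inhibitory reversal potentials
tau_e, tau_i = 3.0, 8.0          # synaptic decay time constants
rate_e, rate_i = 0.8, 0.2        # Poisson pulse rates (1/ms)
a_e, a_i = 2.0, 1.5              # mean conductance jumps (nS), exponentially distributed
v_th, v_reset, dt, T = -50.0, -70.0, 0.05, 2000.0

v, ge, gi = E_L, 0.0, 0.0
spikes, vs = 0, []
for _ in range(int(T / dt)):
    # Synaptic shot noise: Poisson arrivals of exponentially distributed jumps.
    if random.random() < rate_e * dt:
        ge += random.expovariate(1.0 / a_e)
    if random.random() < rate_i * dt:
        gi += random.expovariate(1.0 / a_i)
    ge -= ge / tau_e * dt
    gi -= gi / tau_i * dt
    # Conductance-based membrane: drive scales with distance to each reversal potential.
    v += (g_L * (E_L - v) + ge * (E_e - v) + gi * (E_i - v)) / C * dt
    if v >= v_th:                # threshold crossing: spike and reset
        spikes += 1
        v = v_reset
    vs.append(v)

print(spikes, round(min(vs), 1), round(max(vs), 1))
```

Because the synapses have reversal potentials, the voltage is confined between E_i and threshold regardless of the input statistics, a constraint absent from current-based shot-noise models.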
- Research Article
- 10.3934/era.2024033
- Jan 1, 2024
- Electronic Research Archive
- Feibiao Zhan + 2 more
In this paper, we explore the mechanisms of central pattern generators (CPGs), circuits that can generate rhythmic patterns of motor activity without external input. We study the half-center oscillator, a simple form of CPG circuit consisting of neurons connected by reciprocally inhibitory synapses. We examine the role of asymmetric coupling factors in shaping rhythmic activity and how different network topologies contribute to network efficiency. We have discovered that neurons with lower synaptic strength are more susceptible to noise that affects rhythm changes. Our research highlights the importance of asymmetric coupling factors, noise, and other synaptic parameters in shaping the broad regimes of CPG rhythms. Finally, we compare the regular regimes of three topology types and provide insights into how to locate the rhythmic activity.