Modeling Higher-Order Interactions in Sparse and Heavy-Tailed Neural Population Activity.

Abstract

Neurons process sensory stimuli efficiently, showing sparse yet highly variable ensemble spiking activity involving structured higher-order interactions. Notably, while neural populations are mostly silent, they occasionally exhibit highly synchronous activity, resulting in sparse and heavy-tailed spike-count distributions. However, the mechanistic origin of this activity remains unclear; specifically, it is unknown what types of nonlinear properties in individual neurons induce such population-level patterns. In this study, we derive sufficient conditions under which the joint activity of homogeneous binary neurons generates sparse and widespread population firing rate distributions in infinitely large networks. We then propose a subclass of exponential family distributions that satisfies these conditions. This class incorporates structured higher-order interactions with alternating signs and shrinking magnitudes, along with a base-measure function that offsets distributional concentration, giving rise to parameter-dependent sparsity and heavy-tailed population firing rate distributions. Analysis of recurrent neural networks that recapitulate these distributions reveals that individual neurons possess threshold-like nonlinearity followed by supralinear activation, which jointly facilitate sparse and synchronous population activity. These nonlinear features resemble those of modern Hopfield networks, suggesting a connection between widespread population activity and a network's memory capacity. The theory establishes sparse and heavy-tailed distributions for binary patterns, forming a foundation for developing energy-efficient spike-based learning machines.
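
As a concrete illustration of the distribution class described above, the sketch below assumes a homogeneous population in which the spike count n = Σᵢ xᵢ is a sufficient statistic, interaction coefficients θ_k = θ(-1)^(k+1)/k with alternating signs and shrinking magnitudes, and a base measure that exactly offsets the binomial multiplicity of count patterns. These functional forms are illustrative assumptions chosen to reproduce the qualitative behavior, not the authors' exact construction.

```python
# Minimal sketch (not the authors' code) of a homogeneous exponential-family
# population model over the spike count n of N binary neurons:
#   P(n) ∝ C(N, n) * h(n) * exp( sum_k theta_k * C(n, k) ).
# Assumed forms: alternating-sign, shrinking-magnitude interactions
# theta_k = theta * (-1)^(k+1) / k, and a base measure h(n) = 1 / C(N, n)
# that offsets the binomial concentration of counts. Via the identity
# sum_{k=1}^{n} (-1)^(k+1) C(n, k) / k = H_n (the n-th harmonic number),
# the log-probability collapses to theta * H_n ≈ theta * log n: an
# approximately power-law, i.e. sparse and heavy-tailed, count distribution.
import numpy as np
from scipy.special import logsumexp

N = 100                                      # population size
theta = -1.5                                 # shape parameter (assumed value)
H = np.concatenate(([0.0], np.cumsum(1.0 / np.arange(1, N + 1))))

log_p = theta * H                            # multiplicity and base measure cancel
log_p -= logsumexp(log_p)                    # normalize
p = np.exp(log_p)

print(f"P(n = 0) = {p[0]:.3f}")              # most probable state: silence
print(f"P(n >= {N // 2}) = {p[N // 2:].sum():.2e}")  # heavy tail of synchrony
```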

Similar Papers
  • Conference Article
  • Citations: 2
  • 10.1109/ijcnn.1992.227271
Analysis of learning recurrent neural networks: connective stability and equilibrium manifold
  • Jun 7, 1992
  • H.C. Tseng + 1 more

Stability analysis of recurrent neural networks with a learning rule based on the concept of an equilibrium manifold is considered. Recurrent neural networks with learning rules have changing equilibria during the learning process. The authors design a learning rule that enables the recurrent neural network to store a desired pattern based on the concept of the equilibrium manifold. A stability criterion for the learning neural network is established as a function of the learning rate, the sigmoid activation function, and the upper bound on the interconnection strengths.
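
For a concrete handle on criteria of this flavor, here is a minimal sketch of a generic contraction-based sufficient condition (not Tseng's specific equilibrium-manifold criterion): for x_{t+1} = σ(W x_t + b) with an L-Lipschitz sigmoid, L·‖W‖₂ < 1 guarantees a unique, globally stable equilibrium.

```python
# Generic contraction-based sufficient condition (illustrative; not the
# paper's equilibrium-manifold criterion): for x_{t+1} = sigma(W x_t + b)
# with an L-Lipschitz sigmoid, L * ||W||_2 < 1 makes the map a contraction,
# so a unique equilibrium exists and attracts every trajectory.
import numpy as np

rng = np.random.default_rng(0)
W = 0.3 * rng.standard_normal((5, 5))        # interconnection strengths (toy)
b = rng.standard_normal(5)
L = 0.25                                     # Lipschitz constant of 1/(1+e^-x)

print(f"L*||W||_2 = {L * np.linalg.norm(W, 2):.3f} (stable if < 1)")

x = rng.standard_normal(5)
for _ in range(200):                         # iterate to the unique equilibrium
    x = 1.0 / (1.0 + np.exp(-(W @ x + b)))
print("equilibrium:", np.round(x, 4))
```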

  • Research Article
  • Citations: 602
  • 10.1109/tnnls.2014.2317880
A Comprehensive Review of Stability Analysis of Continuous-Time Recurrent Neural Networks
  • Jul 1, 2014
  • IEEE Transactions on Neural Networks and Learning Systems
  • Huaguang Zhang + 2 more

Stability problems of continuous-time recurrent neural networks have been extensively studied, and many papers have been published in the literature. The purpose of this paper is to provide a comprehensive review of the research on stability of continuous-time recurrent neural networks, including Hopfield neural networks, Cohen-Grossberg neural networks, and related models. Since time delay is inevitable in practice, stability results of recurrent neural networks with different classes of time delays are reviewed in detail. For the case of delay-dependent stability, the results on how to deal with the constant/variable delay in recurrent neural networks are summarized. The relationship among stability results in different forms, such as algebraic inequality forms, M-matrix forms, linear matrix inequality forms, and Lyapunov diagonal stability forms, is discussed and compared. Some necessary and sufficient stability conditions for recurrent neural networks without time delays are also discussed. Concluding remarks and future directions of stability analysis of recurrent neural networks are given.
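
One of the condition forms the review compares, the M-matrix form, can be checked numerically in a few lines. The sketch below uses a generic textbook condition for Hopfield-type networks, not a specific theorem from the review.

```python
# Numerical check of an M-matrix-form sufficient condition of the kind the
# review compares (generic textbook version): for x' = -x + W*sigma(x) + u
# with L_i-Lipschitz activations, global asymptotic stability holds if
# M = I - |W| diag(L) is a nonsingular M-matrix.
import numpy as np

W = np.array([[0.2, -0.4],
              [0.3, 0.1]])                   # toy interconnection matrix
L = np.array([1.0, 1.0])                     # activation Lipschitz constants

M = np.eye(2) - np.abs(W) @ np.diag(L)
off_diag_nonpositive = np.all(M - np.diag(np.diag(M)) <= 0)
eigs_positive = np.all(np.linalg.eigvals(M).real > 0)
print("nonsingular M-matrix condition holds:",
      bool(off_diag_nonpositive and eigs_positive))
```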

  • Research Article
  • Citations: 1
  • 10.1515/auto-2022-0032
Robustness analysis and training of recurrent neural networks using dissipativity theory
  • Aug 4, 2022
  • at - Automatisierungstechnik
  • Patricia Pauli + 2 more

Neural networks are widely applied in control applications, yet providing safety guarantees for neural networks is challenging due to their highly nonlinear nature. We provide a comprehensive introduction to the analysis of recurrent neural networks (RNNs) using robust control and dissipativity theory. Specifically, we consider the H₂ performance and the ℓ₂-gain to quantify the robustness of dynamic RNNs with respect to input perturbations. First, we analyze the robustness of RNNs using the proposed robustness certificates and then we present linear matrix inequality constraints to be used in the training of RNNs to enforce robustness. Finally, we illustrate in a numerical example that the proposed approach enhances the robustness of RNNs.
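
As a rough empirical counterpart to such certificates, simulating an RNN on random finite-energy inputs gives a lower bound on its ℓ₂-gain (the paper's LMI certificates bound it from above). The toy network below is an assumption of this sketch, not the authors' example.

```python
# Empirical lower bound on the l2-gain of a small discrete-time RNN (toy
# network; the paper's LMIs give upper-bound certificates instead).
import numpy as np

rng = np.random.default_rng(1)
W = 0.4 * rng.standard_normal((4, 4))        # recurrent weights (toy)
U = rng.standard_normal((4, 2))              # input weights
C = rng.standard_normal((1, 4))              # output weights

def run(u):
    # x_{t+1} = tanh(W x_t + U u_t), y_t = C x_t
    x, ys = np.zeros(4), []
    for ut in u:
        x = np.tanh(W @ x + U @ ut)
        ys.append(C @ x)
    return np.array(ys)

gains = []
for _ in range(100):                         # random finite-energy inputs
    u = rng.standard_normal((50, 2))
    y = run(u)
    gains.append(np.sqrt((y ** 2).sum() / (u ** 2).sum()))
print(f"empirical l2-gain lower bound: {max(gains):.3f}")
```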

  • Research Article
  • Citations: 30
  • 10.1016/j.neucom.2016.04.052
Stability analysis of recurrent neural networks with interval time-varying delay via free-matrix-based integral inequality
  • May 13, 2016
  • Neurocomputing
  • Wen-Juan Lin + 4 more

  • Research Article
  • Citations: 8
  • 10.1016/j.ejcon.2021.06.022
[Formula omitted]-induced norm analysis of discrete-time LTI systems for nonnegative input signals and its application to stability analysis of recurrent neural networks
  • Jul 10, 2021
  • European Journal of Control
  • Yoshio Ebihara + 5 more

  • Research Article
  • Citations: 47
  • 10.1109/tnnls.2021.3105519
An Overview of the Stability Analysis of Recurrent Neural Networks With Multiple Equilibria.
  • Mar 1, 2023
  • IEEE Transactions on Neural Networks and Learning Systems
  • Peng Liu + 2 more

The stability analysis of recurrent neural networks (RNNs) with multiple equilibria has received extensive interest since it is a prerequisite for successful applications of RNNs. With the increasing theoretical results on this topic, it is desirable to review the results for a systematical understanding of the state of the art. This article provides an overview of the stability results of RNNs with multiple equilibria including complete stability and multistability. First, preliminaries on the complete stability and multistability analysis of RNNs are introduced. Second, the complete stability results of RNNs are summarized. Third, the multistability results of various RNNs are reviewed in detail. Finally, future directions in these interesting topics are suggested.
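
The simplest instance of the multistability studied here is a single Hopfield-type unit (a toy example, not from the survey): x' = -x + w·tanh(x) with w > 1 has an unstable equilibrium at the origin and two stable equilibria at ±x*, so the attractor reached depends on the initial state.

```python
# Minimal multistable system (toy example, not from the survey):
# x' = -x + w*tanh(x). For w > 1 the origin is unstable and two stable
# equilibria +/-x* appear, so the attractor reached depends on x(0).
import numpy as np

w, dt = 2.0, 0.01
for x0 in (-0.1, 0.1, 1.5):
    x = x0
    for _ in range(5000):                    # forward Euler integration
        x += dt * (-x + w * np.tanh(x))
    print(f"x(0) = {x0:+.2f}  ->  x(inf) ~ {x:+.4f}")
```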

  • Research Article
  • Citations: 37
  • 10.1016/j.neunet.2017.09.013
Multistability and instability analysis of recurrent neural networks with time-varying delays
  • Oct 14, 2017
  • Neural Networks
  • Fanghai Zhang + 1 more

  • Research Article
  • Citations: 32
  • 10.1142/s025295990400041x
On Periodic Dynamical Systems
  • Oct 1, 2004
  • Chinese Annals of Mathematics
  • Wenlian Lu + 1 more

The authors investigate the existence and the global stability of periodic solutions for dynamical systems with periodic interconnections, inputs, and self-inhibitions. The model is very general, the conditions are quite weak, and the results obtained are universal.

  • Conference Article
  • Citations: 5
  • 10.1109/cdc45484.2021.9683530
Stability Analysis of Recurrent Neural Networks by IQC with Copositive Multipliers
  • Dec 14, 2021
  • Yoshio Ebihara + 5 more

This paper is concerned with the stability analysis of the recurrent neural networks (RNNs) by means of the integral quadratic constraint (IQC) framework. The rectified linear unit (ReLU) is typically employed as the activation function of the RNN, and the ReLU has specific nonnegativity properties regarding its input and output signals. Therefore, it is effective if we can derive IQC-based stability conditions with multipliers taking care of such nonnegativity properties. However, such nonnegativity (linear) properties are hardly captured by the existing multipliers defined on the positive semidefinite cone. To get around this difficulty, we loosen the standard positive semidefinite cone to the copositive cone, and employ copositive multipliers to capture the nonnegativity properties. We show that, within the framework of the IQC, we can employ copositive multipliers (or their inner approximation) together with existing multipliers such as Zames-Falb multipliers and polytopic bounding multipliers, and this directly enables us to ensure that the introduction of the copositive multipliers leads to better (no more conservative) results. We finally illustrate the effectiveness of the IQC-based stability conditions with the copositive multipliers by numerical examples.
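
The inner approximation mentioned here is commonly taken to be the cone of matrices M = P + N with P positive semidefinite and N entrywise nonnegative; every such M is copositive, and the decomposition is exact up to dimension four. Below is a hedged feasibility-check sketch of this sufficient test (assuming the cvxpy package; not the authors' code).

```python
# Sufficient copositivity test via the standard inner approximation of the
# copositive cone: M is copositive if M = P + N with P positive semidefinite
# and N entrywise nonnegative (this decomposition is exact for n <= 4).
import cvxpy as cp
import numpy as np

M = np.array([[1.0, -0.6, 0.0],
              [-0.6, 1.0, -0.6],
              [0.0, -0.6, 1.0]])             # symmetric test matrix (toy)

P = cp.Variable((3, 3), PSD=True)
N = cp.Variable((3, 3), symmetric=True)
prob = cp.Problem(cp.Minimize(0), [M == P + N, N >= 0])
prob.solve()
print("copositive (sufficient test passed):", prob.status == "optimal")
```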

  • Research Article
  • Citations: 5
  • 10.1038/s41598-022-11032-y
The impact of pitolisant, an H3 receptor antagonist/inverse agonist, on perirhinal cortex activity in individual neuron and neuronal population levels
  • May 12, 2022
  • Scientific Reports
  • Kyosuke Hirano + 3 more

Histamine is a neurotransmitter that modulates neuronal activity and regulates various brain functions. Histamine H3 receptor (H3R) antagonists/inverse agonists enhance its release in most brain regions, including the cerebral cortex, which improves learning and memory and exerts an antiepileptic effect. However, the mechanism underlying the effect of H3R antagonists/inverse agonists on cortical neuronal activity in vivo remains unclear. Here, we show how pitolisant, an H3R antagonist/inverse agonist, influences perirhinal cortex (PRh) activity at the individual-neuron and neuronal-population levels. We monitored neuronal activity in the PRh of freely moving mice using in vivo Ca2+ imaging through a miniaturized one-photon microscope. Pitolisant increased the activity of some PRh neurons while decreasing the activity of others, without affecting the mean activity across neurons. Moreover, it increased the number of neuron pairs with synchronous activity in excitatory-responsive neuronal populations. Furthermore, machine learning analysis revealed that pitolisant altered the neuronal population activity. The changes in population activity depended on which neurons were excited and which were inhibited by pitolisant treatment. These findings indicate that pitolisant influences the activity of a subset of PRh neurons by increasing synchronous activity and modifying population activity.

  • Conference Article
  • Citations: 1
  • 10.1109/biocas.2018.8584741
High-Capacity Fingerprint Recognition System based on a Dynamic Memory-Capacity Estimation Technique
  • Oct 1, 2018
  • Pavan Kumar Chundi + 3 more

Estimating the current memory capacity of a neural-network-based recognition system is critical for maximally using the available capacity to memorize new inputs without exceeding its limit (catastrophic forgetting). In this paper, we propose a dynamic approach to monitoring a network's memory capacity. Prior works in this area have presented static expressions dependent on the neuron count N, forcing designers to assume worst-case input characteristics for bias and correlation when setting the capacity of the network. Instead, our technique operates simultaneously with the learning of a Hopfield network and produces a capacity estimate based on the patterns that were actually stored. By continuously updating the crosstalk associated with the stored patterns, our model guards the network against overwriting its memory traces and exceeding its capacity. We designed a fingerprint recognition system based on our dynamic estimation technique. In experiments with the NIST Special Database 10, the system achieves 2.7 to 8× larger memory capacity than baseline systems using static capacity estimates.
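
The crosstalk quantity underlying such estimates has a standard textbook form: with Hebbian weights W = (1/N)Σ_μ ξ^μ (ξ^μ)ᵀ and zero self-connections, the local field under a stored pattern splits into a signal term near 1 plus interference from the other patterns. Below is a sketch of this generic computation, not the paper's estimator.

```python
# Generic Hopfield crosstalk computation (textbook form; the paper's dynamic
# estimator continuously tracks a quantity of this kind as patterns arrive).
# Recall of pattern mu degrades where the interference term can flip a bit,
# i.e. where the aligned local field h_i * xi_i^mu drops below zero.
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 20                               # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))        # random bipolar patterns

W = (xi.T @ xi) / N                          # Hebbian weight matrix
np.fill_diagonal(W, 0.0)                     # no self-connections

mu = 0
h = W @ xi[mu]                               # local fields under pattern mu
aligned = h * xi[mu]                         # signal (~1) plus crosstalk noise
print(f"bits at risk of flipping: {(aligned <= 0).sum()} of {N}")
print(f"crosstalk std: {aligned.std():.3f} (theory ~ sqrt(P/N) = {np.sqrt(P / N):.3f})")
```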

  • Research Article
  • Citations: 55
  • 10.1002/cne.1047
Intrinsic connectivity of the rat subiculum: II. Properties of synchronous spontaneous activity and a demonstration of multiple generator regions.
  • Jun 8, 2001
  • Journal of Comparative Neurology
  • Elana Harris + 1 more

Brain structures that can generate epileptiform activity possess excitatory interconnections among principal cells and a subset of these neurons that can be spontaneously active ("pacemaker" cells). We describe electrophysiological evidence for excitatory interactions among rat subicular neurons. Subiculum was isolated from presubiculum, CA1, and entorhinal cortex in ventral horizontal slices. Nominally zero magnesium perfusate, picrotoxin (100 microM), or NMDA (20 microM) was used to induce spontaneous firing in subicular neurons. Synchronous population activity and the spread of population events from one end of subiculum to the other in isolated subicular subslices indicate that subicular pyramidal neurons are coupled together by excitatory synapses. Both electrophysiological classes of subicular pyramidal cells (bursting and regular spiking) exhibited synchronous activity, indicating that both cell classes are targets of local excitatory inputs. Burst firing neurons were active in the absence of synchronous activity in field recordings, indicating that these cells may serve as pacemaker neurons for the generation of epileptiform activity in subiculum. Epileptiform events could originate at either proximal or distal segments of the subiculum from ventral horizontal slices. In some slices, events originated in both proximal and distal locations and propagated to the other location. Finally, propagation was supported over axonal paths through the cell layer and in the apical dendritic zone. We conclude that subicular burst firing and regular spiking neurons are coupled by means of glutamatergic synapses. These connections may serve to distribute activity driven by topographically organized inputs and to synchronize subicular cell activity.

  • Research Article
  • Citations: 8
  • 10.1155/2010/191546
Stability Analysis of Recurrent Neural Networks with Random Delay and Markovian Switching
  • Jan 1, 2010
  • Journal of Inequalities and Applications
  • Enwen Zhu + 4 more

In this paper, the exponential stability analysis problem is considered for a class of recurrent neural networks (RNNs) with random delay and Markovian switching. The evolution of the delay is modeled by a continuous-time homogeneous Markov process with a finite number of states. The main purpose of this paper is to establish easily verifiable conditions under which the random delayed recurrent neural network with Markovian switching is exponentially stable. The analysis is based on the Lyapunov-Krasovskii functional and stochastic analysis approach, and the conditions are expressed in terms of linear matrix inequalities, which can be readily checked by using some standard numerical packages such as the Matlab LMI Toolbox. A numerical example is exploited to show the usefulness of the derived LMI-based stability conditions.
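
The setting itself is easy to simulate even though the paper's contribution is an analytical LMI test: the sketch below runs a scalar delayed recurrence whose delay switches with a two-state Markov chain, with all parameters chosen arbitrarily for illustration.

```python
# Toy simulation of the setting (illustrative only; the paper's contribution
# is an analytical LMI-based certificate): a scalar delayed recurrence
#   x_{t+1} = a*x_t + b*tanh(x_{t - d(r_t)}),
# where the delay d depends on a two-state Markov chain r_t. Decaying sample
# paths suggest, but do not prove, exponential stability.
import numpy as np

rng = np.random.default_rng(3)
a, b = 0.5, 0.3                              # here |a| + |b| < 1 (assumed)
delays = {0: 1, 1: 4}                        # mode-dependent delays
T = np.array([[0.9, 0.1],                    # Markov transition matrix
              [0.2, 0.8]])

x, r = [1.0] * 5, 0                          # constant initial history, mode 0
for _ in range(100):
    r = rng.choice(2, p=T[r])                # Markov switching of the mode
    x.append(a * x[-1] + b * np.tanh(x[-1 - delays[r]]))
print(f"|x(100)| = {abs(x[-1]):.2e}")
```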

  • Research Article
  • Citations: 17
  • 10.1364/ol.472267
Design and analysis of recurrent neural networks for ultrafast optical pulse nonlinear propagation.
  • Oct 18, 2022
  • Optics Letters
  • Gustavo R Martins + 4 more

In this work, we analyze different types of recurrent neural networks (RNNs) under a range of parameter settings to best model the nonlinear optical dynamics of pulse propagation. We studied the propagation of picosecond and femtosecond pulses under distinct initial conditions through 13 m of a highly nonlinear fiber and demonstrated two RNNs achieving a normalized root-mean-squared error (NRMSE) as low as 9%. These results were further extended to a dataset outside the initial pulse conditions used in RNN training, and the best-proposed network still achieved an NRMSE below 14%. We believe that this study contributes to a better understanding of building RNNs for modeling nonlinear optical pulse propagation and of how peak power and nonlinearity affect the prediction error.
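
For reference, the reported metric is straightforward to reproduce; one common convention normalizes the RMSE by the range of the reference signal, though the abstract does not state which normalization the authors use.

```python
# NRMSE with range normalization (a common convention; the paper's exact
# normalization is not stated in the abstract).
import numpy as np

def nrmse(y_true, y_pred):
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

t = np.linspace(0.0, 1.0, 500)
pulse = np.exp(-((t - 0.5) / 0.1) ** 2)      # stand-in pulse profile (assumed)
pred = pulse + 0.02 * np.random.default_rng(4).standard_normal(t.size)
print(f"NRMSE = {100 * nrmse(pulse, pred):.1f}%")
```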

  • Research Article
  • Citations: 12
  • 10.1016/j.tcs.2003.09.006
Global exponential convergence of recurrent neural networks with variable delays
  • Sep 28, 2003
  • Theoretical Computer Science
  • Zhang Yi

More from: Neural computation
  • New
  • Research Article
  • 10.1162/neco.a.36
Working Memory and Self-Directed Inner Speech Enhance Multitask Generalization in Active Inference.
  • Oct 29, 2025
  • Neural computation
  • Jeffrey Frederic Queißer + 1 more

  • New
  • Research Article
  • 10.1162/neco.a.37
Model Predictive Control on the Neural Manifold.
  • Oct 29, 2025
  • Neural computation
  • Christof Fehrman + 1 more

  • New
  • Research Article
  • 10.1162/neco.a.38
Unsupervised Learning in Echo State Networks for Input Reconstruction.
  • Oct 29, 2025
  • Neural computation
  • Taiki Yamada + 2 more

  • Research Article
  • 10.1162/neco.a.27
Feature Normalization Prevents Collapse of Noncontrastive Learning Dynamics.
  • Oct 10, 2025
  • Neural computation
  • Han Bao

  • Research Article
  • 10.1162/neco.a.35
Modeling Higher-Order Interactions in Sparse and Heavy-Tailed Neural Population Activity.
  • Oct 10, 2025
  • Neural computation
  • Ulises Rodríguez-Domínguez + 1 more

  • Research Article
  • 10.1162/neco.a.34
A Chimera Model for Motion Anticipation in the Retina and the Primary Visual Cortex.
  • Oct 10, 2025
  • Neural computation
  • Jérôme Emonet + 5 more

  • Research Article
  • 10.1162/neco.a.30
Encoding of Numerosity With Robustness to Object and Scene Identity in Biologically Inspired Object Recognition Networks.
  • Oct 10, 2025
  • Neural computation
  • Thomas Chapalain + 2 more

  • Research Article
  • 10.1162/neco.a.28
Firing Rate Models as Associative Memory: Synaptic Design for Robust Retrieval.
  • Sep 22, 2025
  • Neural computation
  • Simone Betteti + 3 more

  • Research Article
  • 10.1162/neco.a.29
Transformer Models for Signal Processing: Scaled Dot-Product Attention Implements Constrained Filtering.
  • Sep 22, 2025
  • Neural computation
  • Terence D Sanger

  • Research Article
  • 10.1162/neco.a.26
Rapid Reweighting of Sensory Inputs and Predictions in Visual Perception.
  • Sep 22, 2025
  • Neural computation
  • William Turner + 3 more
