Abstract

This paper is concerned with the modeling and analysis of two of the most commonly used recurrent neural network models, namely, the Hopfield neural network and the firing-rate neural network, with dynamic recurrent connections undergoing Hebbian learning. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize certain key dynamical properties. First, we give biologically inspired forward-invariance results. Then, we give sufficient conditions for the non-Euclidean contractivity of the models. Our contraction analysis leads to stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based upon biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum in-degree, and the maximum synaptic strength. Then, we show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example.
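To make the class of models concrete, below is a minimal simulation sketch of a Hopfield-type network coupled with a decaying Hebbian rule for the weights. This is not the paper's exact formulation; the activation, the specific Hebbian rule, and all parameter names and values (a, b, phi, dt, T) are illustrative assumptions chosen to match the standard textbook forms of these dynamics.

```python
# Hedged sketch: Euler simulation of coupled neural-synaptic dynamics,
# assuming a standard Hopfield-type model and a Hebbian rule with decay.
import numpy as np

rng = np.random.default_rng(0)

n = 10           # number of neurons (illustrative)
a = 1.0          # neural decay rate (illustrative)
b = 0.5          # synaptic decay rate (illustrative)
dt = 1e-3        # Euler step
T = 5.0          # simulation horizon

phi = np.tanh    # saturating activation, common in Hopfield models

x = rng.standard_normal(n)             # neural states
W = 0.1 * rng.standard_normal((n, n))  # synaptic weights
u = 0.5 * np.ones(n)                   # constant external input

for _ in range(int(T / dt)):
    fx = phi(x)
    # Hopfield-type neural dynamics: leak + recurrent input + external input
    dx = -a * x + W @ fx + u
    # Hebbian rule with decay: weights grow with correlated pre/post activity
    dW = -b * W + np.outer(fx, fx)
    x += dt * dx
    W += dt * dW

print("final neural state:", x)
print("max |W|:", np.abs(W).max())
```

In this sketch, the synaptic decay term -b * W plays the stabilizing role that the paper's contractivity conditions make precise; flipping the sign of the outer-product term would give an anti-Hebbian variant.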
