Abstract

Self-organization is thought to play an important role in structuring nervous systems. It frequently arises as a consequence of plasticity mechanisms in neural networks: connectivity determines network dynamics, which in turn feed back on network structure through various forms of plasticity. Recently, self-organizing recurrent neural network models (SORNs) have been shown to learn non-trivial structure in their inputs and to reproduce the experimentally observed statistics and fluctuations of synaptic connection strengths in cortex and hippocampus. However, the dynamics in these networks, and how they change as the network evolves, are still poorly understood. Here we investigate the degree of chaos in SORNs by studying how the networks' self-organization changes their response to small perturbations. We study the effect of perturbations to the excitatory-to-excitatory weight matrix on connection strengths and on unit activities. We find that the network dynamics, characterized by an estimate of the maximum Lyapunov exponent, become less chaotic during self-organization, developing into a regime where only a few perturbations become amplified. We also find that, due to the mixing of discrete and (quasi-)continuous variables in SORNs, small perturbations to the synaptic weights may become amplified only after a substantial delay, a phenomenon we propose to call deferred chaos.
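To make the perturbation analysis concrete, the following is a minimal sketch of a standard maximum-Lyapunov-exponent estimate for a discrete-time network. The update function `step` and all parameter values are illustrative assumptions, not the paper's actual implementation: a reference and a perturbed trajectory are advanced in parallel, the perturbation is renormalized after every step, and the average log growth rate approximates the exponent.

```python
import numpy as np

def estimate_max_lyapunov(step, x0, n_steps=1000, eps=1e-8, rng=None):
    """Estimate the maximum Lyapunov exponent of a discrete-time map.

    `step` maps a state vector to the next state (hypothetical network
    update); `x0` is the initial state. A tiny random perturbation is
    renormalized after every step, and the mean log growth rate of the
    distance between the two trajectories approximates lambda_max.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d0 = rng.normal(size=x.shape)
    y = x + d0 * (eps / np.linalg.norm(d0))
    log_growth = 0.0
    for _ in range(n_steps):
        x, y = step(x), step(y)
        dist = np.linalg.norm(y - x)
        if dist == 0.0:  # perturbation absorbed (possible with binary units)
            return -np.inf
        log_growth += np.log(dist / eps)
        y = x + (y - x) * (eps / dist)  # renormalize the perturbation
    return log_growth / n_steps         # > 0 suggests chaos, < 0 stability
```

With binary units, the measured distance can stay at zero for long stretches before suddenly growing, which is one way the deferred amplification described above can show up in such an estimate.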

Highlights

  • A fundamental question in neuroscience is how cortical circuits acquire the structure required to perform desired computations

  • We characterize the degree of chaos by an estimate of the maximum Lyapunov exponent and study it at different stages of network evolution

  • We introduce a structural plasticity (SP) mechanism to compensate for the synapse elimination induced by spike-timing dependent plasticity (STDP); a sketch of one such rule follows this list
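A minimal sketch of one plausible SP rule of this kind, under assumptions not taken from the paper: a dense numpy weight matrix, a creation probability p_sp, and an initial weight w_init are all illustrative choices.

```python
import numpy as np

def structural_plasticity(w_ee, p_sp=0.1, w_init=0.001, rng=None):
    """With probability p_sp, create one new excitatory synapse.

    A randomly chosen absent (zero-weight, non-self) connection is
    initialized at a small weight w_init, counteracting the gradual
    synapse elimination caused by STDP-driven pruning.
    """
    rng = np.random.default_rng() if rng is None else rng
    if rng.random() < p_sp:
        absent = np.argwhere(w_ee == 0.0)
        absent = absent[absent[:, 0] != absent[:, 1]]  # no self-connections
        if len(absent) > 0:
            i, j = absent[rng.integers(len(absent))]
            w_ee[i, j] = w_init
    return w_ee
```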


Introduction

A fundamental question in neuroscience is how cortical circuits acquire the structure required to perform desired computations. Recent modeling work has shown that recurrent spiking neural networks with multiple forms of plasticity can learn interesting representations of sensory inputs [1] and reproduce experimental data on the statistics and fluctuations of synaptic connection strengths in cortex and hippocampus [2]. These self-organizing recurrent networks (SORNs) rely on an interplay of spike-timing dependent plasticity (STDP) and different homeostatic mechanisms. Here we characterize the degree of chaos by an estimate of the maximum Lyapunov exponent and study it at different stages of network evolution.
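To illustrate how such an interplay can be set up, here is a minimal sketch of one SORN-style update with binary units, in the spirit of the models cited above. The specific update rules, learning rates, and target rate are assumptions chosen for illustration, not the published model.

```python
import numpy as np

def sorn_step(x_prev, w_ee, t_e, eta_stdp=0.004, eta_ip=0.01, h_ip=0.1):
    """One illustrative SORN-style update for binary excitatory units.

    The activity update is followed by three plasticity rules: additive
    STDP on existing synapses, synaptic normalization (incoming weights
    of each unit sum to one), and intrinsic plasticity nudging each
    unit's threshold toward a target firing rate h_ip.
    """
    # Activity: threshold the recurrent drive (inhibition and input omitted).
    x = (w_ee @ x_prev - t_e > 0.0).astype(float)

    # STDP: strengthen pre-before-post pairs, weaken the reverse order.
    mask = (w_ee > 0.0).astype(float)   # plasticity only on existing synapses
    w_ee += eta_stdp * (np.outer(x, x_prev) - np.outer(x_prev, x)) * mask
    np.clip(w_ee, 0.0, None, out=w_ee)  # weights stay non-negative

    # Synaptic normalization: rescale each unit's incoming weights.
    row_sums = w_ee.sum(axis=1, keepdims=True)
    w_ee /= np.where(row_sums > 0.0, row_sums, 1.0)

    # Intrinsic plasticity: adapt thresholds toward the target rate.
    t_e += eta_ip * (x - h_ip)
    return x, w_ee, t_e
```

Iterating this step lets the weight matrix self-organize; the Lyapunov estimate sketched after the abstract can then be applied to snapshots of the network at different stages of this evolution.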

