Abstract

Autonomous, randomly coupled neural networks display a transition to chaos at a critical coupling strength. Here we investigate the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing. Dynamic mean-field theory yields the statistics of the activity, the maximum Lyapunov exponent, and the memory capacity of the network. We find an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form. The input suppresses chaos by a dynamic mechanism, shifting the transition to significantly larger coupling strengths than predicted by local stability analysis. Beyond linear stability, a regime of coexisting locally expansive but nonchaotic dynamics emerges that optimizes the capacity of the network to store sequential input.
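Since this excerpt does not reproduce the paper's equations, the following is a hedged sketch of the standard model class this line of work builds on: a network of N rate units with i.i.d. Gaussian random couplings, driven by a time-varying external input. The notation (x_i, phi, J_ij, xi_i, g) is assumed here, not taken from the paper.

```latex
% Sketch of the standard driven random rate network (assumed notation;
% this excerpt does not reproduce the paper's own equations).
% x_i : activation of unit i,  \phi : transfer function (e.g. tanh),
% J_{ij} : i.i.d. Gaussian couplings,  \xi_i(t) : time-varying input.
\begin{equation}
  \dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{N} J_{ij}\,\phi\bigl(x_j(t)\bigr) + \xi_i(t),
  \qquad J_{ij} \sim \mathcal{N}\!\bigl(0,\, g^2/N\bigr).
\end{equation}
```

For the autonomous case xi_i = 0 with phi = tanh, the classic mean-field result is a transition to chaos at the critical coupling g = 1 (cf. the collective-chaos results cited as [1,2,3,4] below); the abstract's claim is that input shifts this transition to significantly larger g.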

Highlights

  • Large random networks of neuronlike units can exhibit collective chaotic dynamics [1,2,3,4]

  • Dynamic mean-field theory yields the statistics of the activity, the maximum Lyapunov exponent, and the memory capacity of the network

  • We find an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form

INTRODUCTION

Large random networks of neuronlike units can exhibit collective chaotic dynamics [1,2,3,4]. Time-varying input can suppress this chaos, but the mechanism is so far understood only for low-dimensional systems in the context of chaos synchronization by noise [24], in networks driven by deterministic signals [18], and in systems with discrete-time dynamics [23]. In the latter model, the effect of the stochastic input on the transition to chaos is completely captured by its influence on the spectral radius of the Jacobian matrix. We find that the input suppresses chaos significantly more strongly than expected from time-local linear stability, the criterion valid in discrete-time systems. This observation is explained by a dynamic effect: the decrease of the maximum Lyapunov exponent is related to the sharpening of the autocorrelation function by the stochastic drive. We further find that the memory capacity peaks within the expansive, nonchaotic regime, indicating that locally expansive but asymptotically stable dynamics is beneficial for storing input sequences in the dynamics of the neural network.
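As a concrete illustration of the quantity discussed above, here is a minimal numerical sketch (our own construction, not the paper's method; the model and all parameters are assumptions) that estimates the maximum Lyapunov exponent of a driven random network by evolving two copies with an identical input realization and repeatedly renormalizing their separation, i.e., the standard Benettin procedure. Because both copies receive the same drive, any divergence reflects the intrinsic dynamics rather than the input itself.

```python
# Minimal sketch (assumed model, not the paper's code): estimate the maximum
# Lyapunov exponent of  dx/dt = -x + J @ tanh(x) + xi(t)  with an
# Euler-Maruyama scheme and two-trajectory (Benettin) renormalization.
import numpy as np

rng = np.random.default_rng(seed=1)
N = 200          # network size
g = 2.0          # coupling strength (autonomous tanh network chaotic for g > 1)
sigma = 1.0      # input amplitude; set to 0.0 for the autonomous case
dt, steps = 0.01, 20_000
d0 = 1e-8        # fixed separation between the two trajectory copies

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # i.i.d. Gaussian couplings
x = rng.normal(size=N)                             # reference trajectory
v = rng.normal(size=N)
y = x + d0 * v / np.linalg.norm(v)                 # perturbed copy

log_growth = 0.0
for _ in range(steps):
    xi = sigma * np.sqrt(dt) * rng.normal(size=N)  # one noise realization,
    x = x + dt * (-x + J @ np.tanh(x)) + xi        # fed identically to
    y = y + dt * (-y + J @ np.tanh(y)) + xi        # both copies
    d = np.linalg.norm(y - x)
    log_growth += np.log(d / d0)
    y = x + (d0 / d) * (y - x)                     # renormalize separation

print(f"lambda_max ~ {log_growth / (steps * dt):.3f}")  # > 0 indicates chaos
```

Sweeping g with sigma = 0 versus sigma > 0 should reproduce the qualitative picture described above: the zero crossing of the exponent moves to noticeably larger coupling under drive.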

DYNAMIC MEAN-FIELD EQUATION
EFFECTIVE EQUATION OF MOTION OF THE AUTOCORRELATION FUNCTION
EFFECT OF INPUT ON THE TRANSITION TO CHAOS
STATIC AND DYNAMIC SUPPRESSION OF CHAOS
NONVANISHING MEAN COUPLING AND NON-NEGATIVE TRANSFER FUNCTIONS
Rectified-linear transfer function
Leaky integrate-and-fire neuron transfer function
INFORMATION-PROCESSING CAPABILITIES
DISCUSSION
General case of colored Gaussian noise
Quenched Gaussian noise
Lyapunov exponent for colored and quenched-noise input