Abstract

Neural circuits operate with delays over a range of time scales, from a few milliseconds in recurrent local circuitry to tens of milliseconds or more for communication between populations. Modeling usually incorporates single fixed delays, meant to represent the mean conduction delay between neurons making up the circuit. We explore conditions under which the inclusion of more delays in a high-dimensional chaotic neural network leads to a reduction in dynamical complexity, a phenomenon recently described as multi-delay complexity collapse (CC) in delay-differential equations with one to three variables. We consider a recurrent local network of 80% excitatory and 20% inhibitory rate model neurons with 10% connection probability. An increase in the width of the distribution of local delays, even to unrealistically large values, does not cause CC, nor does adding more local delays. Interestingly, multiple small local delays can cause CC provided there is a moderate global delayed inhibitory feedback and random initial conditions. CC then occurs through the settling of transient chaos onto a limit cycle. In this regime, there is a form of noise-induced order in which the mean activity variance decreases as the noise increases and disrupts the synchrony. Another novel form of CC is seen where global delayed feedback causes “dropouts,” i.e., epochs of low firing rate network synchrony. Their alternation with epochs of higher firing rate asynchrony closely follows Poisson statistics. Such dropouts are promoted by larger global feedback strength and delay. Finally, periodic driving of the chaotic regime with global feedback can cause CC; the extinction of chaos can outlast the forcing, sometimes permanently. Our results suggest a wealth of phenomena that remain to be discovered in networks with clusters of delays.
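The network architecture described above — a sparsely connected rate network of 80% excitatory and 20% inhibitory neurons with a short local delay and a longer global delayed inhibitory feedback — can be sketched with simple Euler integration. This is an illustrative reconstruction, not the authors' code; all parameter values and the tanh rate function are placeholder assumptions:

```python
import numpy as np

def simulate(N=100, p=0.1, frac_exc=0.8, g=5.0, tau=10.0,
             d_local=1.0, d_global=20.0, k_global=1.0,
             dt=0.1, T=500.0, seed=0):
    """Euler integration of a delayed rate network (illustrative sketch).

    tau dx_i/dt = -x_i + sum_j W_ij tanh(x_j(t - d_local))
                  - k_global * mean_j tanh(x_j(t - d_global))

    Times are in ms. Local recurrent connections carry a short delay
    d_local; an all-to-all inhibitory feedback carries a longer delay
    d_global, as in the setup described in the abstract.
    """
    rng = np.random.default_rng(seed)
    n_exc = int(frac_exc * N)

    # Sparse random weights: 10% connection probability; excitatory
    # columns positive, inhibitory columns negative (Dale-like signs).
    W = (rng.random((N, N)) < p) * rng.normal(0.0, g / np.sqrt(p * N), (N, N))
    W[:, :n_exc] = np.abs(W[:, :n_exc])
    W[:, n_exc:] = -np.abs(W[:, n_exc:])

    steps = int(T / dt)
    dl = int(d_local / dt)        # local delay in steps
    dg = int(d_global / dt)       # global feedback delay in steps
    hist = max(dl, dg) + 1        # history buffer needed by the delays

    x = np.zeros((steps + hist, N))
    x[:hist] = rng.normal(0.0, 0.1, (hist, N))  # random initial history

    for t in range(hist, steps + hist):
        r_local = np.tanh(x[t - 1 - dl])            # delayed local rates
        r_global = np.tanh(x[t - 1 - dg]).mean()    # delayed mean rate
        dx = (-x[t - 1] + W @ r_local - k_global * r_global) / tau
        x[t] = x[t - 1] + dt * dx
    return x[hist:]

x = simulate()
print(x.shape)  # (5000, 100)
```

Varying `d_local`, `d_global`, and the global feedback gain `k_global` is how one would probe the regimes discussed in the abstract (transient chaos settling onto a limit cycle, dropout epochs); the specific parameter values here are not taken from the paper.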

Highlights

  • Biological neural networks can involve delays from below the millisecond time scale up to several tens of milliseconds (Madadi Asl et al., 2018)

  • When the time delay between neurons in the local recurrent circuitry is increased to 10 ms, chaotic dynamics can no longer be seen, and harmonics appear in the power spectrum at integer multiples of 25.6 Hz

  • We have focused on the properties of a rate-based neural network with a small number of short delays in the local sparsely connected EI recurrent circuitry, and how this is altered by a longer delay that acts globally through all-to-all feedback inhibition


Introduction

Biological neural networks can involve delays from below the millisecond time scale up to several tens of milliseconds (Madadi Asl et al., 2018). Local circuitry involves delays, which are often neglected in modeling studies due to the added dynamical complexity they bring to the problem. Delays have been shown to promote oscillations (Belair et al., 1996; Brunel and Hakim, 1999; Bimbard et al., 2016), and to play important roles in synchronization phenomena (Coombes and Laing, 2009) and learning (Gerstner et al., 1996). They are omnipresent in large-scale neural control systems, where they can reach many hundreds of milliseconds, e.g., in reflex arcs (Longtin et al., 1990).
