Abstract

The brain consists of many interconnected networks with time-varying, partially autonomous activity. There are multiple sources of noise and variation, yet activity must eventually converge to a stable, reproducible state (or sequence of states) for its computations to make sense. We approached this problem from a control-theory perspective by applying contraction analysis to recurrent neural networks. This allowed us to find mechanisms for achieving stability in multiple connected networks with biologically realistic dynamics, including synaptic plasticity and time-varying inputs. These mechanisms included inhibitory Hebbian plasticity, excitatory anti-Hebbian plasticity, synaptic sparsity and excitatory-inhibitory balance. Our findings shed light on how stable computations might be achieved despite biological complexity. Crucially, our analysis is not limited to the stability of fixed geometric objects in state space (e.g., points, lines, planes); it addresses the stability of state trajectories, which may be complex and time-varying.
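
For readers unfamiliar with contraction analysis, the core condition, stated here in the standard Lohmiller and Slotine form (the notation below is ours and may differ from the paper's), is that a generalized Jacobian be uniformly negative definite:

```latex
% A system \dot{x} = f(x, t) is contracting if there exist a metric
% M(x, t) \succ 0 and a rate \lambda > 0 such that
\dot{M} + M \frac{\partial f}{\partial x} + \frac{\partial f}{\partial x}^{\top} M \preceq -2 \lambda M ,
% in which case all trajectories converge to one another exponentially,
% regardless of initial conditions; this is what lets the analysis cover
% complex, time-varying trajectories rather than only fixed points.
```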

Highlights

  • Behavior emerges from complex neural dynamics unfolding over time in multi-area brain networks

  • Using a Jacobian-based contraction analysis, we found that inhibitory Hebbian synaptic plasticity leads to stable dynamics in neural circuits (see the sketch after this list)

  • Echo State Networks (ESNs) are a special case of the networks considered here (S1A Text Section 5.1), despite several distinctions (e.g., ESNs are typically discrete-time rather than continuous-time dynamical systems)
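
To make the plasticity highlight concrete, here is a minimal sketch, assuming a standard rate network and an illustrative inhibitory Hebbian rule (the parameters and the specific rule are our assumptions, not the paper's exact formulation). It simulates the network with plastic inhibition and numerically checks the contraction condition on the neuronal Jacobian along the trajectory; the paper analyzes the combined neuron-and-synapse system, so this is only a toy check.

```python
import numpy as np

# Toy rate network (illustrative, not the paper's exact model):
#   dx/dt = -x + (W_E - W_I) @ phi(x) + u(t),   phi = tanh,
# with inhibitory Hebbian plasticity
#   dW_I/dt = eta * (phi(x) phi(x)^T - W_I).
rng = np.random.default_rng(0)
N, dt, T, eta = 20, 1e-3, 2.0, 0.5

W_E = np.abs(rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)))  # fixed excitatory weights
W_I = np.abs(rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)))  # plastic inhibitory weights
x = rng.normal(size=N)

def phi(v):
    return np.tanh(v)

def dphi(v):
    return 1.0 - np.tanh(v) ** 2

def max_sym_eig(x, W_E, W_I):
    """Largest eigenvalue of the symmetric part of the neuronal Jacobian
    J = -I + (W_E - W_I) @ diag(phi'(x)); the rate dynamics are contracting
    (in the identity metric) wherever this quantity is negative."""
    J = -np.eye(N) + (W_E - W_I) @ np.diag(dphi(x))
    return np.linalg.eigvalsh(0.5 * (J + J.T)).max()

worst = -np.inf
for step in range(int(T / dt)):
    u = 0.5 * np.sin(2.0 * np.pi * step * dt) * np.ones(N)  # time-varying input
    r = phi(x)
    x = x + dt * (-x + (W_E - W_I) @ r + u)
    W_I = np.clip(W_I + dt * eta * (np.outer(r, r) - W_I), 0.0, None)  # Hebbian inhibition
    worst = max(worst, max_sym_eig(x, W_E, W_I))

# A negative value means the neuronal dynamics stayed contracting along this trajectory.
print("largest symmetric-Jacobian eigenvalue observed:", worst)
```

If the printed value is negative, any two trajectories of the (frozen-weight) rate dynamics converge to one another exponentially; whether stability survives once the plastic weights are treated as part of the state is precisely what the paper's joint neuron-and-synapse contraction analysis addresses.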


Introduction

Behavior emerges from complex neural dynamics unfolding over time in multi-area brain networks. Even in tightly controlled experimental settings, these neural dynamics often vary between identical trials [1,2]. This variability can arise from many factors, including fluctuations in membrane potentials, variable inputs, and plastic changes due to recent experience. In spite of these fluctuations, brain networks must achieve computational stability: despite being “knocked around” by plasticity and noise, the behavioral output of the brain on two experimentally identical trials needs to be similar. There have been a number of recent studies, both computational and experimental, that focus more broadly on the stability of neural trajectories, which may be complex and time-varying [12,13].
