Abstract

Stochastic Resonance (SR) and Coherence Resonance (CR) are non-linear phenomena in which an optimal amount of noise maximizes an objective function, such as the sensitivity to weak signals in SR, or the coherence of stochastic oscillations in CR. Here, we demonstrate a related phenomenon, which we call “Recurrence Resonance” (RR): noise can also improve the information flux in recurrent neural networks. In particular, we show for the case of three-neuron motifs with ternary connection strengths that the mutual information between successive network states can be maximized by adding a suitable amount of noise to the neuron inputs. This striking result suggests that noise in the brain may not be a problem that needs to be suppressed, but rather a resource that is dynamically regulated in order to optimize information processing.
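
To make the RR effect concrete, the following is a minimal simulation sketch in Python/NumPy. The ternary weight matrix `W`, the deterministic threshold update with additive Gaussian input noise, and the run length are illustrative assumptions rather than the paper's exact model; the sketch only shows how the mutual information between successive states of a small motif can be estimated across noise levels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ternary weight matrix of one three-neuron motif
# (entries in {-1, 0, +1}); the paper studies six such motifs.
W = np.array([[ 0,  1, -1],
              [-1,  0,  1],
              [ 1, -1,  0]])

def simulate(W, sigma, T=100_000):
    """Run a binary three-neuron motif with additive Gaussian input noise
    of standard deviation sigma; return the sequence of integer states 0..7."""
    x = rng.integers(0, 2, size=3)                  # binary neuron outputs
    states = np.empty(T, dtype=np.int64)
    for t in range(T):
        u = W @ x + sigma * rng.standard_normal(3)  # noisy recurrent input
        x = (u > 0).astype(np.int64)                # deterministic threshold
        states[t] = 4 * x[0] + 2 * x[1] + x[2]      # encode network state
    return states

def mutual_information(states, n=8):
    """MI (in bit) between successive states, from the joint histogram."""
    joint = np.zeros((n, n))
    np.add.at(joint, (states[:-1], states[1:]), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)           # P(s_t)
    py = joint.sum(axis=0, keepdims=True)           # P(s_{t+1})
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

# Sweeping sigma: recurrence resonance predicts a peak of the MI at an
# intermediate noise level, with a decay again for strong noise.
for sigma in [0.0, 0.1, 0.3, 0.5, 1.0, 2.0, 4.0]:
    print(f"sigma = {sigma:4.1f}  ->  MI = {mutual_information(simulate(W, sigma)):.3f} bit")
```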

Highlights

  • Recurrent neural networks (RNN) with apparently random connections occur ubiquitously in the brain (Middleton and Strick, 2000; Song et al., 2005)

  • For each of the six motifs, we investigate the statistical and information-theoretic properties as functions of the noise level

  • As the noise level is increased, all state probabilities asymptotically approach the uniform value of 1/8, and we find a monotonic increase of the state entropy toward the maximum value of 3 bit, which in principle should favor the information flux in the motif and help to increase the mutual information (MI) between successive states (see the entropy sketch below)
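
To illustrate the entropy saturation described in the last highlight, here is a short continuation of the sketch shown after the abstract; it reuses that sketch's `simulate` helper and weight matrix `W` (both illustrative assumptions, not the paper's exact model) and estimates the state entropy as a function of the noise level.

```python
import numpy as np
# Requires simulate() and W from the sketch after the abstract.

def state_entropy(states, n=8):
    """Shannon entropy (in bit) of the empirical state distribution."""
    p = np.bincount(states, minlength=n) / len(states)
    p = p[p > 0]                       # drop unvisited states
    return float(-np.sum(p * np.log2(p)))

# As sigma grows, all eight state probabilities approach 1/8 and the
# entropy saturates at its maximum of 3 bit.
for sigma in [0.0, 0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"sigma = {sigma:4.1f}  ->  H = {state_entropy(simulate(W, sigma)):.3f} bit")
```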

Introduction

Recurrent neural networks (RNN) with apparently random connections occur ubiquitously in the brain (Middleton and Strick, 2000; Song et al., 2005). They can be viewed as complex non-linear systems, capable of ongoing activity even in the absence of driving inputs, and they show rich dynamics, including oscillatory, chaotic, and stationary fixed-point behavior (Krauss et al., 2019). A typical RNN will not conserve the input information in its original form, but transform it into new and possibly more useful representations at each time step. This ability of RNNs to dynamically store and continuously re-code information, as well as the possibility to combine the circulating information with new inputs, is essential for the processing of sequential data (Skowronski and Harris, 2007).
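
The re-coding mechanism described above can be illustrated with a toy update rule. The specific weights, input coupling, and noise level below are hypothetical and only demonstrate how a fresh input is merged with the circulating state at every time step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical weights (not taken from the paper):
W_rec = rng.choice([-1, 0, 1], size=(3, 3))   # ternary recurrent weights
w_in = np.array([1, 0, -1])                   # routes a scalar input u_t

def step(x, u, sigma=0.3):
    """One update: combine the circulating state with a fresh input,
    add Gaussian noise, and threshold; the new state re-codes both."""
    return (W_rec @ x + w_in * u + sigma * rng.standard_normal(3) > 0).astype(int)

x = np.zeros(3, dtype=int)
for t, u in enumerate([1, 0, 0, 1, 1]):       # a short input sequence
    x = step(x, u)
    print(f"t={t}  input={u}  state={x}")
```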
