Abstract

We exhibit a class of smooth continuous-state neural-inspired networks composed of simple nonlinear elements that can be made to function as a finite-state computational machine. We give an explicit construction of arbitrary finite-state virtual machines in the spatiotemporal dynamics of the network. The dynamics of the functional network can be completely characterized as a "noisy network attractor" in phase space operating in either an "excitable" or a "free-running" regime, corresponding respectively to excitable or heteroclinic connections between states. The regime depends on the sign of an "excitability parameter." Viewing the network as a nonlinear stochastic differential equation in which a deterministic (signal) and/or a stochastic (noise) input is applied to any element, we explore the influence of the signal-to-noise ratio on the error rate of the computations. The free-running regime is extremely sensitive to inputs: arbitrarily small-amplitude perturbations can be used to perform computations with the system as long as the input dominates the noise. We find a counter-intuitive regime in which increasing the noise amplitude leads to more, rather than less, accurate computation. We suggest that noisy network attractors will be useful for understanding neural networks that reliably and sensitively perform finite-state computations in a noisy environment.
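The core idea — discrete machine states realized as attracting regions of a noisy continuous dynamical system, switched by a deterministic input that must dominate the noise — can be illustrated with a minimal sketch. The following is not the paper's construction; it is a generic one-dimensional analogue (a bistable element, hypothetical drift `x - x**3 + u(t)`) integrated by the Euler–Maruyama method, in which the two stable equilibria near x = ±1 play the role of two "machine states," a brief super-threshold input pulse reliably flips the state, and weak noise alone does not:

```python
import numpy as np

def simulate(noise_sigma, signal, T=200.0, dt=0.01, seed=0):
    """Euler-Maruyama integration of the illustrative SDE
        dx = (x - x**3 + u(t)) dt + sigma dW.
    The equilibria near x = -1 and x = +1 stand in for two
    discrete states of a finite-state machine; the input u(t)
    is the 'signal' that drives transitions between them."""
    rng = np.random.default_rng(seed)
    x = -1.0  # start in the "low" state
    for k in range(int(T / dt)):
        u = signal(k * dt)
        # deterministic drift step + Gaussian noise increment
        x += (x - x**3 + u) * dt + noise_sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# A brief pulse above the saddle-node threshold (~0.385 for this drift)
# flips the element to the "high" state ...
pulse = lambda t: 0.6 if 50.0 < t < 60.0 else 0.0
flipped = simulate(noise_sigma=0.05, signal=pulse)

# ... while weak noise with no input leaves the state unchanged,
# because the escape time over the potential barrier is enormous.
unchanged = simulate(noise_sigma=0.05, signal=lambda t: 0.0)
```

This toy captures only the "excitable" flavor of switching (input-triggered transitions out of attracting states); the free-running heteroclinic regime and the construction of arbitrary virtual machines require the full network of coupled elements described in the paper.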
