Abstract
The authors use noise to restore chaotic behavior in neural networks and show that this behavior is consistent with super-Turing theory and operation.
Highlights
(1) Noise improves deep learning and is recognized as an important contributor to neural networks (NNs) in artificial intelligence and neuroscience [1,2,3]. (2) We extend and apply theorems from Balcázar et al. concerning an infinite hierarchy between P and P/poly [21] to show that an infinite hierarchy exists between P and BPP/log*. (3) We investigate how different noise magnitudes optimize chaos-mimicking calculations in digital recurrent neural networks (RNNs). (4) This paper demonstrates that a super-Turing mathematical computational complexity class (BPP/log*) matches the physical reality of recurrent neural networks.
Summary
Noise improves deep learning and is recognized as an important contributor to neural networks (NNs) in artificial intelligence and neuroscience [1,2,3]. In [4], the focus was on analog signals, which differ significantly from the digital techniques currently used to develop NNs. The main thrust was that the stochastic resonance (SR) present in artificial intelligence and neuroscience is inherent in the proof of super-Turing BPP/log* networks and likely contributed to the hardware mimicking chaos. This paper demonstrates how noise aids digital RNNs in attaining super-Turing operation in the realm of the BPP/log* computation class, similar to analog RNNs. We investigate how limited-precision systems (digital RNNs) move from non-chaotic behavior at small noise magnitudes, through consistency with chaos at intermediate magnitudes, to noise overwhelming the dynamics at large magnitudes.
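The regime transition described above can be illustrated with a toy model. The sketch below is not the authors' experimental setup (which uses digital RNNs); it is a minimal stand-in, assuming a quantized logistic map as the limited-precision chaotic system, with the `quantize` and `iterate` helpers being hypothetical names introduced here. A deterministic finite-precision map must eventually fall into a periodic cycle and so visits only a few distinct states, while injecting noise on the order of the quantization step lets the orbit keep exploring the state space, mimicking the underlying chaos.

```python
import random

def quantize(x, levels=256):
    """Round the state to a fixed grid, mimicking limited numeric precision."""
    return round(x * levels) / levels

def iterate(noise_std, steps=5000, levels=256, r=4.0, seed=1):
    """Iterate a quantized logistic map with additive Gaussian noise.

    Returns the number of distinct quantized states visited, a rough
    proxy for how much of the chaotic attractor the orbit explores.
    """
    rng = random.Random(seed)
    x = 0.3
    visited = set()
    for _ in range(steps):
        x = r * x * (1.0 - x)           # chaotic logistic map step
        x += rng.gauss(0.0, noise_std)  # injected noise
        x = min(max(x, 0.0), 1.0)       # keep the state in [0, 1]
        x = quantize(x, levels)         # limited precision
        visited.add(x)
    return len(visited)

# No noise: the finite-state orbit collapses into a short cycle.
# Noise comparable to the grid spacing (1/256): many more states visited.
for sigma in (0.0, 0.01):
    print(f"noise_std={sigma}: {iterate(sigma)} distinct states")
```

With large noise (e.g. a standard deviation near the full state range) the dynamics would be dominated by the noise itself rather than the map, corresponding to the "overwhelming" regime; the distinct-state count alone does not distinguish that regime, so a real analysis would use a dynamical measure rather than this simple proxy.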