Abstract

In the phenomenon of stochastic resonance, adding a certain nonzero level of noise to a nonlinear system reduces information loss. A previous study proposed a neural network composed of thresholding functions that exploits stochastic resonance both at run time and during training, so that the network can map inputs smoothly and be trained by backpropagation. Such a network can equivalently be described as one that functions only when noise is added, i.e., one that can neither map smoothly nor be trained when noise is absent. Building on both views, this paper proposes a neural network in which a sub-network is activated selectively by adding noise locally to that sub-network. To this end, a new activation function is introduced: it exploits stochastic resonance and yields zero output and zero derivative when no noise is added. Simple simulations confirm that the proposed network with the new activation function allows sub-networks to be functionalized selectively, and interpolation between sub-networks is investigated by imposing varying noise intensities on different regions of the network after the sub-networks are trained separately.
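To make the mechanism concrete, the sketch below is a minimal illustration, not the paper's exact construction. It shows the standard stochastic-resonance effect on a thresholding unit: averaging a Heaviside step over additive Gaussian noise yields a smooth, differentiable response (approximately the Gaussian CDF Phi(x / sigma)), whereas at zero noise the unit collapses to a hard step. The `gated_activation` function is a hypothetical stand-in for the kind of activation the abstract describes, constructed so that its output (and hence its derivative) is identically zero when no noise is injected.

```python
import numpy as np

def noisy_threshold(x, sigma, n_samples=2000, rng=None):
    """Average a Heaviside threshold over additive Gaussian noise.

    For sigma > 0 the Monte-Carlo mean approximates the smooth CDF
    Phi(x / sigma), so the unit becomes differentiable; at sigma == 0
    it collapses to a hard, non-differentiable step.
    """
    rng = np.random.default_rng() if rng is None else rng
    if sigma == 0.0:
        return np.heaviside(x, 0.5)
    noise = rng.normal(0.0, sigma, size=(n_samples,) + np.shape(x))
    return np.heaviside(x + noise, 0.5).mean(axis=0)

def gated_activation(x, sigma, **kw):
    """Hypothetical noise-gated unit (illustrative, not the paper's definition).

    Defined as the noise-smoothed response minus the hard step, it is
    identically zero when sigma == 0 (null output and derivative) and
    becomes a nonzero smooth correction as sigma grows, so the unit
    "switches on" only where noise is injected.
    """
    return noisy_threshold(x, sigma, **kw) - np.heaviside(x, 0.5)

x = np.linspace(-3.0, 3.0, 7)
print(gated_activation(x, sigma=0.0))  # all zeros: sub-network inactive
print(gated_activation(x, sigma=1.0))  # nonzero: sub-network activated
```

Under these assumptions, driving each sub-network's noise intensity toward zero deactivates it, which is consistent with the selective functionalization and noise-controlled interpolation described above.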
