Abstract
Stochastic resonance occurs when noise improves how a nonlinear system performs. This paper presents two general stochastic-resonance theorems for threshold neurons that process noisy Bernoulli input sequences. The performance measure is Shannon mutual information. The theorems show that small amounts of independent additive noise can increase the mutual information of threshold neurons if the neurons detect subthreshold signals. The first theorem shows that this stochastic-resonance effect holds for all finite-variance noise probability density functions that obey a simple mean constraint that the user can control. A corollary shows that this stochastic-resonance effect occurs for the important family of (right-sided) gamma noise. The second theorem shows that this effect holds for all infinite-variance noise types in the broad family of stable distributions. Stable bell curves can model extremely impulsive noise environments. So the second theorem shows that this stochastic-resonance effect is robust against violent fluctuations in the additive noise process.
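The abstract's core claim, that additive noise can raise the Shannon mutual information of a threshold neuron driven by a subthreshold Bernoulli signal, is easy to check numerically. Below is a minimal sketch (not the paper's own code): the two signal levels `a0`, `a1`, the threshold `theta`, and the noise sweep are illustrative choices, and the noise is taken to be Gaussian, a finite-variance case covered by the first theorem. Since both signal levels lie below the threshold, the mutual information is essentially zero without noise and peaks at some positive noise standard deviation.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binary_entropy(q):
    """Entropy in bits of a Bernoulli(q) variable."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1.0 - q) * math.log2(1.0 - q)

def mutual_info(p, p0, p1):
    """I(S;Y) in bits for Bernoulli(p) input S and threshold output Y,
    with firing probabilities p0 = P(Y=1|S=0) and p1 = P(Y=1|S=1)."""
    py1 = (1.0 - p) * p0 + p * p1
    return binary_entropy(py1) - ((1.0 - p) * binary_entropy(p0)
                                  + p * binary_entropy(p1))

# Illustrative parameters (assumptions, not from the paper):
theta = 1.0        # neuron threshold
a0, a1 = 0.2, 0.8  # both signal levels are subthreshold (below theta)
p = 0.5            # Bernoulli input probability

def info_at_sigma(sigma):
    """Mutual information when i.i.d. N(0, sigma^2) noise is added."""
    p0 = 1.0 - phi((theta - a0) / sigma)  # P(fire | S=0)
    p1 = 1.0 - phi((theta - a1) / sigma)  # P(fire | S=1)
    return mutual_info(p, p0, p1)

# Sweep the noise level and find the stochastic-resonance peak.
sigmas = [0.05 * k for k in range(1, 61)]
best_info, best_sigma = max((info_at_sigma(s), s) for s in sigmas)
print(f"near-zero noise: I = {info_at_sigma(0.05):.6f} bits")
print(f"peak: I = {best_info:.4f} bits at sigma = {best_sigma:.2f}")
```

Running the sweep shows the signature nonmonotonic curve: mutual information is negligible at very low noise, rises to a maximum at an intermediate noise level, and decays again as noise swamps the signal.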