Stochastic resonance occurs when noise improves the performance of a nonlinear system. This paper presents two general stochastic-resonance theorems for threshold neurons that process noisy Bernoulli input sequences. The performance measure is Shannon mutual information. The theorems show that small amounts of independent additive noise can increase the mutual information of threshold neurons if the neurons detect subthreshold signals. The first theorem shows that this stochastic-resonance effect holds for all finite-variance noise probability density functions that obey a simple mean constraint that the user can control. A corollary shows that this stochastic-resonance effect occurs for the important family of (right-sided) gamma noise. The second theorem shows that this effect holds for all infinite-variance noise densities in the broad family of stable distributions. Stable bell curves can model extremely impulsive noise environments. So the second theorem shows that this stochastic-resonance effect is robust against violent fluctuations in the additive noise process.
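The finite-variance case can be illustrated numerically. The sketch below (an illustration, not the paper's own derivation) models a threshold neuron with a subthreshold Bernoulli signal and additive Gaussian noise, computes the Shannon mutual information I(S;Y) between the binary input and the thresholded output in closed form via the Gaussian tail probability, and scans the noise standard deviation. The specific parameter values (threshold 1.0, signal amplitude 0.6, equiprobable inputs) are illustrative assumptions.

```python
import math

def threshold_mi(theta, amp, sigma, p=0.5):
    """Mutual information I(S;Y) in bits for a threshold neuron.

    S is Bernoulli: S = amp with probability p, else 0 (subthreshold if
    amp < theta). Output Y = 1 if S + N > theta, with N ~ Gaussian(0, sigma).
    P(Y=1 | S=s) is the Gaussian tail probability Q((theta - s)/sigma).
    """
    q0 = 0.5 * math.erfc(theta / (sigma * math.sqrt(2)))          # P(Y=1 | S=0)
    q1 = 0.5 * math.erfc((theta - amp) / (sigma * math.sqrt(2)))  # P(Y=1 | S=amp)
    py1 = p * q1 + (1 - p) * q0                                   # marginal P(Y=1)
    mi = 0.0
    for ps, q in ((1 - p, q0), (p, q1)):
        for py_given_s, py in ((q, py1), (1 - q, 1 - py1)):
            if py_given_s > 0 and py > 0:
                mi += ps * py_given_s * math.log2(py_given_s / py)
    return mi

# Subthreshold signal: amp = 0.6 < theta = 1.0.
# Near-zero noise yields near-zero information (output is almost always 0);
# moderate noise lifts the signal across the threshold and raises I(S;Y);
# very large noise swamps the signal again -- the stochastic-resonance profile.
mis = {s: threshold_mi(1.0, 0.6, s) for s in (0.01, 0.3, 5.0)}
```

Scanning sigma over a fine grid shows that I(S;Y) is nonmonotonic in the noise intensity, peaking at a strictly positive sigma, which is the signature of stochastic resonance that the first theorem establishes for all finite-variance noise densities under the mean constraint.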