Abstract

Stochastic Pooling Networks (SPN) were recently introduced as a general conceptual framework for modeling surprising nonlinear interactions between redundancy and two forms of 'noise': lossy compression and randomness. The SPN approach arose from studies of biological signal transduction by populations of sensory neurons, but is also suitable for modeling several modern communications and computing paradigms. The common feature required is that lossy compression and the noise-averaging effects of redundancy occur simultaneously. To illustrate the potential for bio-inspired engineering that mimics neural SPNs, here we examine some interesting features of a very simple SPN, where individual network nodes are extremely compressive, and provide only single-bit measurements of analog signals. Information theory is used to quantify the gain obtained from N such measurements. We show that network performance is limited by quantization noise for large input SNRs, but is limited only by the size of the network for small input SNRs. The latter case is shown to approach the performance of a network where there is no lossy compression, indicating that extreme local compression is close to optimal. Finally, interpretations of the mutual information results in terms of both rate-distortion theory and probability of error are given.
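To make the setup concrete, the following is a minimal sketch (not taken from the paper) of one common SPN configuration: N nodes each quantize the same analog input to a single bit in the presence of independent noise, and the network output is the pooled sum of the bits. The specific choices here, a Gaussian input, independent Gaussian noise at each node, and a zero threshold, are assumptions for illustration; the mutual information I(X;Y) is estimated by numerical integration, using the fact that Y given X = x is Binomial(N, p(x)).

```python
# Sketch of a simple stochastic pooling network with 1-bit nodes.
# Assumptions (not specified by the abstract): X ~ N(0, sigma_x^2),
# i.i.d. Gaussian node noise with std sigma_n, threshold at zero,
# pooled output Y = sum of the N bits.

import numpy as np
from scipy.stats import norm, binom

def mutual_information(N, sigma_x=1.0, sigma_n=1.0, n_grid=2001):
    """Estimate I(X; Y) in bits for N single-bit nodes."""
    # Grid over the input amplitude (covers +/- 6 standard deviations).
    x = np.linspace(-6 * sigma_x, 6 * sigma_x, n_grid)
    fx = norm.pdf(x, scale=sigma_x)            # input density
    p_on = norm.cdf(x / sigma_n)               # P(bit = 1 | X = x)

    # P(Y = y | X = x) for y = 0..N: binomial because node noises are i.i.d.
    y = np.arange(N + 1)
    p_y_given_x = binom.pmf(y[None, :], N, p_on[:, None])   # (n_grid, N+1)

    # Marginal P(Y = y) by integrating over the input density.
    p_y = np.trapz(fx[:, None] * p_y_given_x, x, axis=0)
    p_y /= p_y.sum()                           # guard against grid error

    # H(Y) and E_x[H(Y | X = x)], both in bits.
    h_y = -np.sum(p_y * np.log2(p_y, where=p_y > 0,
                                out=np.zeros_like(p_y)))
    h_cond_pointwise = -np.sum(
        p_y_given_x * np.log2(p_y_given_x, where=p_y_given_x > 0,
                              out=np.zeros_like(p_y_given_x)), axis=1)
    h_y_given_x = np.trapz(fx * h_cond_pointwise, x)

    return h_y - h_y_given_x

if __name__ == "__main__":
    # Larger networks recover more information despite 1-bit local compression.
    for N in (1, 3, 15, 63):
        print(f"N = {N:3d}: I(X;Y) ~ {mutual_information(N):.3f} bits")
```

In this sketch, increasing N raises the mutual information even though every node discards all but one bit, which is the redundancy-versus-compression interplay the abstract describes.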
