Abstract

A general framework for modeling surprising nonlinear interactions between redundancy and two forms of 'noise' (lossy compression and randomness) is discussed. This 'stochastic pooling network' (SPN) model arose from studies of signal transduction by populations of biological sensory neurons, but is also applicable to several modern communications and computing approaches. SPNs are networks that simultaneously exhibit noise-averaging effects caused by redundancy and lossy signal compression. Here we illustrate some interesting features of a special case of SPN in which individual network nodes are extremely lossy, each transmitting a single-bit observation of an analog signal. Mutual information is used to quantify the gain obtained from N such observations, which is shown to be limited by quantization noise for large input SNRs, but only by the size of the network for small input SNRs. By comparison with the mutual information for the case of no compression, we show that this means extreme local compression is close to optimal for small SNRs. Interpretations of these results in terms of rate-distortion theory and probability of error are given, indicating that requantization of the output of an SPN can lead to low bit-error-rate communication.
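The special case described above can be sketched numerically. The following is a minimal illustration, not the paper's method: it assumes a Gaussian input signal, N nodes that each observe the signal corrupted by independent Gaussian noise and transmit the single bit sign(x + noise), and a pooled output equal to the count of positive bits. The function name `spn_mutual_info` and all parameter choices (signal variance, bin counts, sample size) are hypothetical, chosen only for the sketch; the mutual information I(X; Y) is estimated by a simple joint histogram.

```python
import numpy as np

rng = np.random.default_rng(0)

def spn_mutual_info(N=7, snr=1.0, samples=200_000, x_bins=64):
    """Monte Carlo estimate of I(X; Y) for an SPN of N single-bit nodes.

    Each node i sees x + eta_i (independent Gaussian noise) and transmits
    one bit, sign(x + eta_i); the pooled output Y is the count of positive
    bits, so Y takes values in {0, ..., N}.
    """
    sigma_x = 1.0
    sigma_n = sigma_x / np.sqrt(snr)        # input SNR = sigma_x^2 / sigma_n^2
    x = rng.normal(0.0, sigma_x, samples)
    eta = rng.normal(0.0, sigma_n, (samples, N))
    y = ((x[:, None] + eta) > 0).sum(axis=1)  # pooled count in 0..N

    # Joint histogram over (binned x, y), then the plug-in MI estimate.
    x_edges = np.linspace(x.min(), x.max(), x_bins + 1)
    xi = np.clip(np.digitize(x, x_edges) - 1, 0, x_bins - 1)
    joint = np.zeros((x_bins, N + 1))
    np.add.at(joint, (xi, y), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
```

Under these assumptions, the estimate is bounded above by log2(N+1) bits, the quantization limit on the pooled output, and growing N raises the estimate at low SNR, consistent with the network-size limit stated in the abstract.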
