Abstract

The distributed remote source coding (so-called CEO) problem is studied in the case where the underlying source, not necessarily Gaussian, has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance) whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon's work. While the bounds hold generally, their value is most pronounced when interpreted as a function of the number of agents in the CEO problem.
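For context, the classical power/entropy-power duality the abstract refers to can be sketched in the direct (single-terminal) setting; this is a minimal illustration of the flavor of the result, not the paper's CEO-specific bounds. With entropy power $N(X) = \frac{1}{2\pi e}\, e^{2h(X)}$ and mean-squared distortion $D$, Shannon's lower bound and the Gaussian upper bound read

\[
\tfrac{1}{2}\log\frac{N(X)}{D} \;\le\; R(D) \;\le\; \tfrac{1}{2}\log\frac{\sigma_X^2}{D}, \qquad 0 < D \le \sigma_X^2 .
\]

For a Gaussian source the two sides coincide, since $N(X) = \sigma_X^2$; the abstract's claim is that an analogous matched pair holds for the sum rate of the multi-agent CEO problem.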
