Abstract

The problem of distributed data compression for function computation is considered, where: 1) the function to be computed is not necessarily a symbolwise function and 2) the information source has memory and may be neither stationary nor ergodic. We introduce the class of smooth sources and give a sufficient condition on functions so that the achievable rate region for computing coincides with the Slepian-Wolf region (i.e., the rate region for reproducing the entire source) for any smooth source. Moreover, for symbolwise functions, the necessary and sufficient condition for this coincidence is established. Our result for the full side-information case generalizes the result by Ahlswede and Csiszár to sources with memory; our dichotomy theorem differs from Han and Kobayashi's dichotomy theorem and reveals an effect of memory in distributed function computation. All results are given not only for fixed-length coding but also for variable-length coding in a unified manner. Furthermore, for the full side-information case, the error probability in the moderate deviation regime is also investigated.
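For reference, a minimal sketch of the classical Slepian-Wolf region for a memoryless pair $(X, Y)$ compressed separately at rates $R_X$ and $R_Y$ and decoded jointly. The notation $R_X$, $R_Y$ is illustrative and not taken from the paper; the general sources treated in the paper require the corresponding information-spectrum quantities in place of the entropies below.

```latex
% Classical Slepian-Wolf region for an i.i.d. pair (X, Y),
% shown only as a point of reference; the paper itself treats
% general (possibly nonstationary, nonergodic) sources.
\begin{align}
  R_X       &\ge H(X \mid Y), \\
  R_Y       &\ge H(Y \mid X), \\
  R_X + R_Y &\ge H(X, Y).
\end{align}
```

The paper's coincidence results identify when the rate region for computing a function of the sources is no smaller than this region for reproducing the sources themselves.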
