Abstract

A problem of interactive function computation in a collocated network is studied in a distributed block source coding framework. With the goal of computing samples of a desired function of the sources at the sink, the source nodes exchange messages through a sequence of error-free broadcasts. For any function of independent sources, a computable characterization of the set of all feasible message coding rates (the rate region) is derived in terms of single-letter information measures. In the limit as the number of messages tends to infinity, the infinite-message minimum sum rate, viewed as a functional of the joint source probability mass function, is characterized as the least element of a partially ordered family of functionals having certain convex-geometric properties. This characterization leads to a family of lower bounds for the infinite-message minimum sum rate and a simple criterion for testing the optimality of any achievable infinite-message sum rate. An iterative algorithm for evaluating the infinite-message minimum sum-rate functional is proposed and demonstrated through the example of computing the minimum function of three Bernoulli sources. Based on the characterizations of the rate regions, it is shown that when computing symmetric functions of binary sources, the sink inevitably learns certain additional information that is not required for computing the function. This conceptual understanding leads to new, improved bounds for the minimum sum rate, which are shown to be orderwise better than cut-set bounds as the network scales. The scaling law of the minimum sum rate is explored for different classes of symmetric functions and source parameters.
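
As an illustrative aside (not part of the paper's analysis), the example of computing the minimum function of three independent Bernoulli(p) sources admits an elementary sanity-check bound: since the sink must recover the function from the broadcast messages, the entropy of the function value, h2(p^3) bits per sample, lower-bounds any achievable sum rate. The short Python sketch below evaluates this quantity; the function names are illustrative and the bound is generally looser than the single-letter characterizations derived in the paper.

```python
import math

def h2(q):
    """Binary entropy in bits, with h2(0) = h2(1) = 0."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def min_function_entropy(p, n=3):
    """Entropy of min(X_1, ..., X_n) for i.i.d. Bernoulli(p) sources.

    The minimum equals 1 only when all n sources equal 1, so it is a
    Bernoulli(p**n) random variable with entropy h2(p**n).
    """
    return h2(p ** n)

if __name__ == "__main__":
    for p in (0.1, 0.3, 0.5, 0.7, 0.9):
        # H(min) is an elementary lower bound on the minimum sum rate,
        # since the sink must recover the function from the broadcasts.
        print(f"p = {p:.1f}: H(min of 3 sources) = {min_function_entropy(p):.4f} bits")
```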
