Abstract

We consider the problem of lossy linear function computation for Gaussian sources in a tree network. The goal is to find the optimal tradeoff between the sum rate (the overall number of bits communicated in the network) and the achieved distortion (the overall mean-square error of estimating the function result) at a specified sink node. Using random Gaussian codebooks, we obtain an inner bound that matches the information-theoretic outer bound (derived in our earlier work [1]) in the limit of zero distortion. To compute the overall distortion of the random coding scheme, we apply the Distortion Accumulation analysis, which was quantified in [1] for MMSE estimates of intermediate computation variables, to the codewords of the random Gaussian codebooks. The key step in applying this analysis is showing that the random-coding-based codeword at the receiver is close, in the mean-square sense, to the MMSE estimate of the source, even when knowledge of the source distribution is not fully accurate.
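For context on the rate–distortion tradeoff the abstract refers to, the classical single-source result for a memoryless Gaussian source, D(R) = σ²·2^(−2R), illustrates how mean-square distortion decays with rate and why the zero-distortion regime corresponds to the high-rate limit. This is a minimal numerical sketch of that standard formula, not the network tradeoff derived in the paper:

```python
def gaussian_distortion(variance: float, rate_bits: float) -> float:
    """Rate-distortion function of a memoryless Gaussian source:
    the minimum achievable mean-square error at a rate of
    `rate_bits` bits per sample is sigma^2 * 2^(-2R)."""
    return variance * 2 ** (-2 * rate_bits)

# Distortion halves twice for every extra bit of rate; it reaches
# zero only in the limit of unbounded rate (the zero-distortion
# limit in which the inner and outer bounds are shown to match).
for R in (0, 1, 2, 4):
    print(f"R = {R} bits/sample -> D = {gaussian_distortion(1.0, R)}")
```

The exponential decay shown here is the per-link behavior; the paper's Distortion Accumulation analysis tracks how such per-hop errors compound along the tree toward the sink.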
