Abstract

We provide a novel upper bound on Witsenhausen's rate, the rate required in the zero-error analogue of the Slepian-Wolf problem. Our bound is given in terms of a new information-theoretic functional defined on a certain graph and is derived by upper-bounding the complementary graph entropy. We use this functional, along with graph entropy, to give a single-letter lower bound on the error exponent for the Slepian-Wolf problem under the vanishing-error-probability criterion, where the decoder has full (i.e., unencoded) side information. We demonstrate that our error exponent can beat the "expurgated" source-coding exponent of Csiszár and Körner for some sources that have zeroes in the "channel" matrix connecting the source with the side information. An extension of our scheme to the lossy case (i.e., Wyner-Ziv) is given. For the case in which the side information is a deterministic function of the source, the exponent of our improved scheme agrees with the sphere-packing bound exactly (thus determining the reliability function). An application of our functional to zero-error channel capacity is also given.
