Abstract

Let X and Y denote two jointly memoryless sources with finite alphabets. Suppose that X is to be encoded in a lossy manner with Y available as side information only at the decoder. A common approach to this lossy source coding problem is to apply conventional vector quantization followed by Slepian-Wolf coding. In this paper we are interested in the rate-distortion performance asymptotically achievable by this approach. Given an arbitrary single-letter distortion measure d, it is shown that the best rate asymptotically achievable under the constraint that X is recovered with distortion no greater than D ≥ 0 is R_WZ(D) = min [I(X; X̂) - I(Y; X̂)], where the minimum is taken over all auxiliary random variables X̂ such that E d(X, X̂) ≤ D and X̂ → X → Y forms a Markov chain. An extended Blahut-Arimoto algorithm is then proposed to calculate R_WZ(D) for any (X, Y) and any distortion measure, and the convergence of the algorithm is also proved. Interestingly, it is observed that the random variable X̂ achieving R_WZ(D) is, in general, different from the random variable X̂′ achieving the classical rate-distortion function R(D) of X at distortion D. In particular, it is shown that in the case of binary sources and the Hamming distortion measure, the random variable X̂ achieving R_WZ(D) is the same as the random variable X̂′ achieving R(D) if and only if the channel p_{Y|X} from X to Y is symmetric. Thus, the design of conventional quantization in the case of side information at the decoder should differ from the case of no side information.
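As a quick numerical sanity check of the minimization above, the sketch below brute-forces I(X; X̂) - I(Y; X̂) over binary test channels on a grid, for a uniform binary source X observed through a binary symmetric channel with crossover probability q, under the Hamming distortion measure. The parameter names a, b, p, q and the grid resolution are illustrative choices, not from the paper, and the grid search merely stands in for the extended Blahut-Arimoto algorithm, whose details are not given in the abstract.

```python
import math

def mutual_info(joint):
    """Mutual information (in bits) between the row and column variables
    of a joint pmf given as a list of lists."""
    pr = [sum(row) for row in joint]
    pc = [sum(col) for col in zip(*joint)]
    return sum(v * math.log2(v / (pr[i] * pc[j]))
               for i, row in enumerate(joint)
               for j, v in enumerate(row) if v > 0)

def objective(a, b, p, q):
    """Return (I(X; Xhat) - I(Y; Xhat), E d(X, Xhat)) for binary X with
    P(X = 1) = p, Y the output of a BSC(q) with input X, Hamming distortion,
    and test channel P(Xhat = 1 | X = 0) = a, P(Xhat = 0 | X = 1) = b.
    Xhat -> X -> Y holds because Xhat is generated from X alone."""
    px = [1 - p, p]
    ch = [[1 - a, a], [b, 1 - b]]     # p(xhat | x)
    pyx = [[1 - q, q], [q, 1 - q]]    # p(y | x)
    # joint pmf of (X, Xhat)
    pxxh = [[px[x] * ch[x][xh] for xh in (0, 1)] for x in (0, 1)]
    # joint pmf of (Y, Xhat): p(y, xhat) = sum_x p(x) p(y|x) p(xhat|x)
    pyxh = [[sum(px[x] * pyx[x][y] * ch[x][xh] for x in (0, 1))
             for xh in (0, 1)] for y in (0, 1)]
    distortion = px[0] * a + px[1] * b  # expected Hamming distortion
    return mutual_info(pxxh) - mutual_info(pyxh), distortion

# Grid search over test channels subject to E d(X, Xhat) <= D.
p, q, D = 0.5, 0.2, 0.1
best = None
for i in range(201):
    for j in range(201):
        a, b = i / 200, j / 200
        val, dist = objective(a, b, p, q)
        if dist <= D and (best is None or val < best[0]):
            best = (val, a, b)
print(f"min I(X;Xhat) - I(Y;Xhat) ~ {best[0]:.4f} at a = {best[1]}, b = {best[2]}")
```

For this symmetric setting the grid minimizer comes out symmetric (a = b = D), consistent with the abstract's claim that for a symmetric p_{Y|X} the optimal test channel coincides with the one achieving R(D); the minimum value agrees with h(D * q) - h(D) ≈ 0.358 bits, where h is the binary entropy function and * denotes binary convolution.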
