Abstract

In this paper we investigate universal data compression with side information at the decoder by leveraging traditional universal data compression algorithms. Specifically, consider a source network with feedback in which a finite-alphabet source X = {X_i}_{i=0}^∞ is to be encoded and transmitted, while another finite-alphabet source Y = {Y_i}_{i=0}^∞, correlated with X, is available only to the decoder as side information. Assuming that the encoder and decoder share a uniform i.i.d. (independent and identically distributed) random database that is independent of (X, Y), we propose a string-matching-based (variable-rate) block coding algorithm with a simple progressive encoder for the feedback source network. Instead of using standard joint typicality decoding, this algorithm derives its decoding rule from the codeword length function of a traditional universal lossless coding algorithm. As a result, neither the encoder nor the decoder requires any prior knowledge of the joint distribution of (X, Y), or even of the achievable rates. It is proven that for any (X, Y) in the class of all stationary, ergodic source-side information pairs with finite alphabets, the average number of bits per letter transmitted from the encoder to the decoder (the compression rate) approaches the conditional entropy rate H(X|Y) of X given Y asymptotically, while the average number of bits per letter transmitted from the decoder to the encoder (the feedback rate) goes to 0 asymptotically.
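To make the decoding principle concrete, the following is a minimal, hypothetical Python sketch of compression with decoder side information in which joint typicality decoding is replaced by a universal-codeword-length criterion, in the spirit of the algorithm described above. It is not the paper's construction: the encoder here transmits fixed random parity (hash) bits derived from the shared random database rather than running the paper's progressive string-matching scheme, zlib's compressed length stands in for the codeword length function of a universal lossless code, and the explicit candidate list is purely illustrative. All names (universal_length, conditional_score, encode, decode) are assumptions of this sketch.

```python
import random
import zlib


def universal_length(data: bytes) -> int:
    # Proxy for the codeword length function of a universal lossless code:
    # the length of the zlib-compressed string (an illustrative stand-in).
    return len(zlib.compress(data, 9))


def conditional_score(candidate: bytes, side_info: bytes) -> int:
    # Approximate conditional description length L(candidate | side_info):
    # the extra length a universal compressor needs for the candidate once
    # the side information has already been described.
    return universal_length(side_info + candidate) - universal_length(side_info)


def encode(block: bytes, shared_db: random.Random, n_hash_bits: int) -> list:
    # Simplified encoder: transmit n_hash_bits random parity bits of the block,
    # with the parity masks drawn from the shared random database.
    # (The paper's encoder is a progressive string-matching scheme instead.)
    value = int.from_bytes(block, "big")
    bits = []
    for _ in range(n_hash_bits):
        mask = shared_db.getrandbits(8 * len(block))
        bits.append(bin(value & mask).count("1") % 2)
    return bits


def decode(hash_bits, side_info: bytes, candidates, db_seed: int) -> bytes:
    # Decoder: among the candidates whose parity bits match the received ones,
    # output the one with the smallest conditional universal description length
    # given the side information; no knowledge of the joint distribution is used.
    survivors = [
        c for c in candidates
        if encode(c, random.Random(db_seed), len(hash_bits)) == hash_bits
    ]
    return min(survivors, key=lambda c: conditional_score(c, side_info))


if __name__ == "__main__":
    db_seed = 2024                                    # seed of the shared random database
    x = b"abracadabra abracadabra abracadabra"
    y = b"abracadabra_abracadabra-abracadabra"        # correlated side information
    # Illustrative candidate list; a real decoder would search a structured set.
    candidates = [x,
                  b"xbracadabra abracadabra abracadabra",
                  b"abracadabra zzzacadabra abracadabra"]
    received = encode(x, random.Random(db_seed), n_hash_bits=16)
    x_hat = decode(received, y, candidates, db_seed)
    print(x_hat == x)   # True with high probability over the shared database
```

In this sketch the number of transmitted bits is fixed in advance; the variable-rate, feedback-driven behavior of the actual algorithm, in which the encoder sends additional information progressively until the decoder succeeds, is not modeled.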
