Abstract

This paper proposes a generalized stochastic mechanism for codebook generation in lossy coding of sources with memory. Earlier work showed that the rate-distortion bound can be asymptotically achieved for discrete memoryless sources by a "natural type selection" (NTS) algorithm. In iteration n, the distribution most likely to produce the types of a sequence of K codewords of finite length l that "d-match" a corresponding sequence of K source words of length l (i.e., that satisfy the distortion constraint) is used to regenerate the codebook for iteration n+1. The resulting sequence of codebook-generating distributions converges to the optimal distribution Q* that achieves the rate-distortion bound for the memoryless source, asymptotically in l, K, and n. This work generalizes the NTS algorithm to account for sources with memory. The algorithm encodes source words of length ml, each consisting of l vectors (or super-symbols) of length m. We show that for finite m and l, the sequence of codebook reproduction distributions Q_{0,m,l}, Q_{1,m,l}, ... (each computed after observing a sequence of K d-match events) converges to the optimal achievable distribution Q*_{m,l} (within a set of achievable distributions determined by m and l), asymptotically in K and n. It is further shown that Q*_{m,l} converges to the optimal reproduction distribution Q* that achieves the rate-distortion bound for sources with memory, asymptotically in m and l.
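The memoryless (m = 1) iteration described above can be illustrated with a toy sketch. All parameters below (a Bernoulli(0.8) source, Hamming distortion with per-letter constraint D = 0.1, word length l = 12, K = 50 d-match events) are illustrative assumptions, not values from the paper; the sketch draws codewords i.i.d. from the current generating distribution until each source word is d-matched, then re-estimates the distribution from the type of the matching codewords.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy parameters (assumptions, not from the paper):
# memoryless Bernoulli(P) source, Hamming distortion constraint D,
# word length l, K d-match events per NTS iteration.
P, D, l, K = 0.8, 0.1, 12, 50

def nts_iteration(Q):
    """One NTS iteration: collect K d-match events, then return the
    empirical type of the matching codewords as the next
    codebook-generating distribution Q_{n+1}."""
    matched = []
    for _ in range(K):
        x = rng.random(l) < P                    # source word of length l
        while True:
            y = rng.random(l) < Q                # codeword drawn i.i.d. from Q
            if np.count_nonzero(x != y) <= D * l:  # d-match event
                matched.append(y)
                break
    return float(np.mean(matched))               # type of the K matched codewords

Q = 0.5                                          # initial generating distribution
for n in range(10):
    Q = nts_iteration(Q)
```

For a binary source with Hamming distortion, the optimal reproduction distribution places mass (p - D)/(1 - 2D) on 1 (here 0.875), and the sequence of Q values drifts toward it, though only asymptotically in l, K, and n; at these toy sizes the fixed point differs noticeably from Q*.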
