Abstract

The author considers the problem of source coding and investigates the cases of known and unknown statistics. The efficiency of compression codes can be estimated by three characteristics: 1) the redundancy (r), defined as the maximal difference between the average codeword length and the Shannon entropy when the letters are generated by a Bernoulli source; 2) the size (in bits) of the encoder and decoder programs (S) when implemented on a computer; and 3) the average time required for encoding and decoding a single letter (T). He investigates S and T as functions of r as r → 0. All known methods may be divided into two classes: the Ziv-Lempel codes and their variants fall under the first class, and the arithmetic code together with the Lynch-Davisson code falls under the second. The codes from the first class need exponential memory, S = O(exp(1/r)), to reach redundancy r as r → 0. The methods from the second class have a small memory size, S = O((1/r)^const), but a low encoding speed. In this paper, the author presents a code that combines the merits of both classes: the memory size is small and the speed is high, S = O((1/r)^const) and T = O(log^const(1/r) · log log(1/r)).
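For illustration only (not taken from the paper), the redundancy characteristic r can be made concrete with a minimal sketch: for a block code over a binary Bernoulli(p) source, compute the average codeword length per letter and subtract the Shannon entropy H(p). The paper's r is the maximum of this quantity over all Bernoulli sources; the function names and the fixed-length example code below are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's construction): per-letter redundancy
# of a block code over a binary Bernoulli(p) source. The paper's r is the
# maximum of this value over all p in [0, 1].
from itertools import product
from math import log2


def bernoulli_entropy(p: float) -> float:
    """Shannon entropy (bits per letter) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)


def per_letter_redundancy(codeword_len, block_len: int, p: float) -> float:
    """Average code length per letter minus H(p), for blocks of block_len letters.

    codeword_len(block) must return the code length (in bits) of a 0/1 tuple.
    """
    avg_len = 0.0
    for block in product((0, 1), repeat=block_len):
        ones = sum(block)
        prob = p ** ones * (1 - p) ** (block_len - ones)
        avg_len += prob * codeword_len(block)
    return avg_len / block_len - bernoulli_entropy(p)


if __name__ == "__main__":
    # Example: a trivial fixed-length code (1 bit per letter) has redundancy
    # 1 - H(p), which vanishes only for the uniform source p = 1/2.
    fixed = lambda block: len(block)
    for p in (0.5, 0.3, 0.1):
        print(p, round(per_letter_redundancy(fixed, 8, p), 4))
```

As the example shows, driving the redundancy toward zero requires better codes (longer blocks or adaptive schemes), which is exactly where the memory size S and per-letter time T discussed in the abstract start to grow.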
