Abstract

The aim of source coding is to remove as much redundancy as possible, so that only the information content of a file remains. This chapter shows that, under the requirement of unique decodability, the average codeword length can be made arbitrarily close to the entropy, but never smaller. A simple algorithm is given for constructing a code, the Huffman code, that is optimal in terms of average codeword length for a source with independent and identically distributed symbols. The Kraft inequality provides the mathematical foundation needed to formulate an optimization problem for the codeword lengths. One standard method for minimizing a function subject to a side constraint is the method of Lagrange multipliers. The idea of arithmetic coding is to consider the probabilities of vectors of symbols and to build a code whose codeword lengths grow as the negative logarithm of the vector's probability, but without the exponential growth in complexity.
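The Huffman construction mentioned above repeatedly merges the two least probable subtrees, which adds one bit to the length of every codeword inside them. A minimal sketch of this idea (the function name `huffman_code_lengths` and the example distribution are ours, not from the chapter):

```python
import heapq

def huffman_code_lengths(freqs):
    """Return {symbol: codeword length} for a {symbol: probability} map.

    Each heap entry is (weight, tiebreak, {symbol: depth}); merging two
    entries increments the depth of every symbol in both subtrees.
    """
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    if len(heap) == 1:           # degenerate single-symbol source: one bit
        (_, _, depths), = heap
        return {s: 1 for s in depths}
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]
```

For a dyadic distribution such as (1/2, 1/4, 1/8, 1/8), the resulting average length equals the entropy exactly, and the lengths satisfy the Kraft inequality with equality.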