It is known that the expected codeword length L_{UD} of the best uniquely decodable (UD) code satisfies H(X) \leq L_{UD}. Let X be a random variable that can take on n values. It is then shown that the average codeword length L_{1:1} of the best one-to-one (not necessarily uniquely decodable) code for X is shorter than the average codeword length L_{UD} of the best uniquely decodable code by no more than \log_{2}\log_{2} n + 3. Let Y be a random variable taking on a finite or countable number of values and having entropy H. It is then proved that L_{1:1} \geq H - \log_{2}(H+1) - \log_{2}\log_{2}(H+1) - \cdots - 6. Some relations are established among the Kolmogorov, Chaitin, and extension complexities. Finally, it is shown that, for all computable probability distributions, the universal prefix codes associated with the conditional Chaitin complexity have expected codeword length within a constant of the Shannon entropy.
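The gap between one-to-one and uniquely decodable coding can be made concrete with a small numerical sketch. The function names below are illustrative, and the code assumes the standard optimal one-to-one construction: sort the probabilities in decreasing order and assign the i-th most probable symbol the i-th shortest binary string (empty string, 0, 1, 00, 01, ...), which has length \lfloor\log_{2} i\rfloor.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def one_to_one_length(p):
    """Expected length of the best one-to-one binary code:
    the i-th most probable symbol (1-indexed) gets the i-th
    shortest binary string, of length floor(log2(i))."""
    p = sorted(p, reverse=True)
    return sum(q * math.floor(math.log2(i)) for i, q in enumerate(p, start=1))

# Uniform distribution on n = 16 symbols: H(X) = 4 bits, yet the
# best one-to-one code averages only 2.375 bits -- strictly below
# the entropy, which no uniquely decodable code can achieve.
n = 16
p = [1 / n] * n
H = entropy(p)       # 4.0
L11 = one_to_one_length(p)  # 2.375
```

Since H \leq L_{UD}, the example also shows L_{UD} - L_{1:1} \geq 1.625 bits here, comfortably within the stated bound \log_{2}\log_{2} 16 + 3 = 5.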