Abstract

In information spectrum methods [4], the performance of a fixed-to-fixed length (FF) source code is measured by the limit superior of the coding rate and the error probability of the code. Since there is a trade-off between the coding rate and the error probability, the optimal coding rate is defined as the infimum of the limit superior of the coding rate subject to the condition that the error probability converges to zero. It is proved that this optimal coding rate coincides with the spectral sup-entropy rate H(X) [4]. Under this criterion, however, the coding rate does not converge to a constant in general. On the other hand, consider the situation where we construct an FF code with vanishing error probability for a stationary and memoryless source. Since almost all typical sequences must be correctly decoded by such a code [3, Theorem 3.3.1], its coding rate is asymptotically lower-bounded by H(X) − γ, where H(X) is the entropy of the source and γ > 0 is an arbitrarily small constant. Hence, if we require the FF code to be optimal, i.e., its coding rate to be upper-bounded by H(X) + γ for sufficiently large blocklength, the coding rate is almost equal to the entropy. That is, the coding rate of any optimal code is arbitrarily close to the entropy. The objective of this paper is to clarify the condition under which the coding rate of any optimal code asymptotically attaining H(X) converges to a constant that is independent of the optimal code. To this end, we define a
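For reference, the central quantities mentioned above can be sketched in standard information-spectrum notation (a sketch following Han's framework [4]; the precise symbols, which the abstract leaves implicit, are assumptions):

```latex
% Spectral sup-entropy rate of a general source \mathbf{X} = \{X^n\}_{n=1}^{\infty},
% where p-limsup denotes the limit superior in probability:
\overline{H}(\mathbf{X})
  = \operatorname*{p\text{-}limsup}_{n \to \infty}
    \frac{1}{n} \log \frac{1}{P_{X^n}(X^n)}.

% An FF code of blocklength n with M_n codewords has rate (1/n) \log M_n.
% With \varepsilon_n denoting its error probability, the optimal FF coding
% rate is the infimum over all code sequences with vanishing error:
R^{*}(\mathbf{X})
  = \inf \Bigl\{ \limsup_{n \to \infty} \tfrac{1}{n} \log M_n
      \;\Bigm|\; \lim_{n \to \infty} \varepsilon_n = 0 \Bigr\}
  = \overline{H}(\mathbf{X}).

% For a stationary memoryless source, \overline{H}(\mathbf{X}) reduces to
% the Shannon entropy H(X), which is the case discussed in the abstract.
```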
