Abstract

This paper describes a random coding model for universal quantization. The universal quantizer consists of a (typically) mismatched random codebook followed by optimal entropy coding. We precisely characterize the rate gain due to entropy coding and show that it can be arbitrarily large. In the special case of entropy-coded i.i.d. Gaussian codebooks with large variance, we draw a novel connection with the compression performance of entropy-coded dithered lattice quantization. Our main tools are large deviations techniques, which allow us to prove an almost sure version of the conditional limit theorem.
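
As a rough numerical illustration of the rate gain from entropy coding (a minimal sketch, not the paper's construction or analysis), the following Python/NumPy snippet quantizes an i.i.d. Gaussian source with a mismatched, larger-variance i.i.d. Gaussian random codebook and compares the fixed-rate cost log2(M)/n with the empirical entropy of the selected codeword indices. All parameter values (block length, rate, variances) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters (assumptions, not taken from the paper):
    n, R = 4, 2.0                        # block length and nominal rate (bits/sample)
    M = int(2 ** (n * R))                # codebook size: 256 codewords
    source_var, code_var = 1.0, 4.0      # mismatch: codebook variance exceeds the source's

    codebook = rng.normal(0.0, np.sqrt(code_var), size=(M, n))
    x = rng.normal(0.0, np.sqrt(source_var), size=(20_000, n))

    # Nearest-neighbor encoding via ||x - c||^2 = ||x||^2 - 2 x.c + ||c||^2
    d2 = (x**2).sum(1, keepdims=True) - 2.0 * (x @ codebook.T) + (codebook**2).sum(1)
    idx = d2.argmin(axis=1)

    # Empirical entropy of the chosen indices: the rate after ideal entropy coding.
    # The mismatch skews the index distribution, so entropy coding gains over log2(M).
    _, counts = np.unique(idx, return_counts=True)
    p = counts / len(idx)
    H = -(p * np.log2(p)).sum()

    print(f"fixed-rate cost:    {np.log2(M) / n:.3f} bits/sample")
    print(f"entropy-coded rate: {H / n:.3f} bits/sample")
    print(f"rate gain:          {(np.log2(M) - H) / n:.3f} bits/sample")

Because the large-variance codebook concentrates the nearest-neighbor choices on its few small-norm codewords, the index distribution is far from uniform, and the empirical entropy falls visibly below log2(M)/n; increasing the variance mismatch widens this gap, consistent with the rate gain being unbounded.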
