Abstract

This correspondence analyzes the low-resolution performance of entropy-constrained scalar quantization. It focuses primarily on Gaussian sources, for which it is shown that for both binary quantizers and infinite-level uniform threshold quantizers, as the distortion $D$ approaches the source variance $\sigma^2$, the least entropy of such quantizers with mean-squared error $D$ or less approaches zero with slope $-\log_2 e/(2\sigma^2)$. Since the Shannon rate-distortion function approaches zero with the same slope, this shows that in the low-resolution region, scalar quantization with entropy coding is asymptotically as good as any coding technique.
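
As a check on the stated slope, recall the Shannon rate-distortion function of a Gaussian source under squared error (a standard result, restated here rather than quoted from the correspondence):

$$R(D) = \tfrac{1}{2}\log_2\frac{\sigma^2}{D}, \qquad 0 < D \le \sigma^2,$$

whose derivative is

$$R'(D) = -\frac{1}{2D\ln 2} = -\frac{\log_2 e}{2D} \;\longrightarrow\; -\frac{\log_2 e}{2\sigma^2} \quad \text{as } D \to \sigma^2,$$

matching the slope with which the quantizer entropy is shown to vanish.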

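The binary-quantizer claim can also be illustrated numerically. The sketch below assumes a unit-variance Gaussian, a single threshold t, and centroid (conditional-mean) reconstruction levels; these specifics are assumptions of the sketch, not details taken from the correspondence. As t grows, D approaches $\sigma^2 = 1$ and the output entropy H approaches 0, with the ratio $H/(\sigma^2 - D)$ drifting (slowly, logarithmically in $t^2$) toward $\log_2 e / 2 \approx 0.7213$, i.e. slope $-\log_2 e/(2\sigma^2)$.

import numpy as np
from scipy.stats import norm

def entropy_and_gain(t):
    """Entropy H (bits) and variance reduction sigma^2 - D of a two-cell
    quantizer on N(0, 1) with threshold t and centroid reconstruction."""
    p_hi = norm.sf(t)                    # P(X > t)
    p_lo = 1.0 - p_hi                    # P(X <= t)
    mu_lo = -norm.pdf(t) / p_lo          # E[X | X <= t] (lower-cell centroid)
    mu_hi = norm.pdf(t) / p_hi           # E[X | X > t]  (upper-cell centroid)
    gain = p_lo * mu_lo**2 + p_hi * mu_hi**2   # sigma^2 - D with centroid levels
    H = -(p_lo * np.log2(p_lo) + p_hi * np.log2(p_hi))
    return H, gain

for t in (1.0, 2.0, 5.0, 10.0, 20.0, 30.0):
    H, gain = entropy_and_gain(t)
    print(f"t={t:4.1f}  H={H:.3e}  sigma^2-D={gain:.3e}  ratio={H/gain:.4f}")
print(f"limiting ratio: log2(e)/2 = {np.log2(np.e)/2:.4f}")

Computing $\sigma^2 - D$ directly, rather than D itself, avoids the catastrophic cancellation that would otherwise occur once D is within machine precision of 1 at large t.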