Abstract

The nonnegativity of relative entropy implies that the differential entropy of a random vector $X$ with probability density function (pdf) $f$ is upper bounded by $-\mathsf{E}\!\left[\log g(X)\right]$ for any pdf $g$. Using this inequality with a cleverly chosen $g$, we derive a lower bound on the asymptotic excess rate of entropy-constrained vector quantization for $d$-dimensional sources and $r$th-power distortion, where the asymptotic excess rate is defined as the difference between the smallest output entropy of a vector quantizer satisfying the distortion constraint and the rate-distortion function in the limit as the distortion tends to zero. Specialized to the one-dimensional case, this lower bound coincides with the asymptotic excess rate achieved by a uniform quantizer, thereby recovering the result of Gish and Pierce that uniform quantizers are asymptotically optimal as the allowed distortion tends to zero. Moreover, in the one-dimensional case, the derivation of the lower bound reveals a necessary condition for a sequence of quantizers to be asymptotically optimal; this condition implies that any asymptotically optimal sequence of almost-regular quantizers must converge to a uniform quantizer as the distortion tends to zero. While the obtained lower bound itself is not novel, to the best of our knowledge we present the first rigorous derivation that follows the direct approach of Gish and Pierce without resorting to the heuristic high-resolution approximations commonly found in the quantization literature. Furthermore, our derivation holds for all $d$-dimensional sources having finite differential entropy and whose integer part has finite entropy; in contrast to Gish and Pierce, we do not require additional constraints on the continuity or decay of the source pdf.
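For the classical one-dimensional, squared-error case ($d = 1$, $r = 2$), the asymptotic excess rate of a uniform quantizer referenced above is the Gish–Pierce constant $\tfrac{1}{2}\log_2\!\frac{2\pi e}{12} \approx 0.2546$ bits. The following is a minimal numerical sketch of this fact, assuming a standard Gaussian source and the high-resolution approximation $D \approx \Delta^2/12$ for the distortion of a uniform quantizer with step $\Delta$; the function name is illustrative and not from the paper:

```python
import math

def uniform_quantizer_entropy(delta, k_max=1500):
    """Output entropy in bits of a uniform quantizer with step `delta`
    applied to a standard Gaussian source; cell k is ((k-1/2)d, (k+1/2)d]."""
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    H = 0.0
    for k in range(-k_max, k_max + 1):
        p = Phi((k + 0.5) * delta) - Phi((k - 0.5) * delta)
        if p > 0.0:
            H -= p * math.log2(p)
    return H

delta = 0.01                        # small step = high-resolution regime
D = delta ** 2 / 12                 # high-resolution distortion of the uniform quantizer
R = 0.5 * math.log2(1.0 / D)        # rate-distortion function of N(0,1): (1/2) log2(sigma^2 / D)
excess = uniform_quantizer_entropy(delta) - R
gish_pierce = 0.5 * math.log2(2 * math.pi * math.e / 12)  # ~0.2546 bits

print(f"excess rate ~ {excess:.4f} bits; Gish-Pierce constant = {gish_pierce:.4f} bits")
```

As $\Delta$ shrinks further, the gap between the computed excess rate and the constant vanishes, which is exactly the asymptotic optimality statement recovered in the paper.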
