Abstract

Bucklew's (1984) high-rate vector quantizer mismatch result is extended from fixed-rate coding to variable-rate coding using a Lagrangian formulation. It is shown that if an asymptotically (high-rate) optimal sequence of variable-rate codes is designed for a k-dimensional probability density function (PDF) g and then applied to another PDF f for which f/g is bounded, then the resulting mismatch, or loss of performance relative to the optimum, is given by the relative entropy or Kullback-Leibler (1968) divergence I(f‖g). It is also shown that under the same assumptions, an asymptotically optimal code sequence for g can be converted to an asymptotically optimal code sequence for a mismatched source f by modifying only the lossless component of the code. Applications to quantizer design using uniform and Gaussian densities are described, including a high-rate analog of the Shannon rate-distortion results of Sakrison (1975) and Lapidoth (1997) showing that the Gaussian is the "worst case" for lossy compression of a source with known covariance. By coupling the mismatch result with composite quantizers, the worst-case properties of uniform and Gaussian densities are extended to conditionally uniform and Gaussian densities, which yields a Lloyd clustering algorithm for fitting mixtures to general densities.
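The central claim is that the high-rate mismatch penalty equals the relative entropy I(f‖g). As an illustrative sketch (not from the paper), the snippet below evaluates this penalty for two one-dimensional Gaussian densities, comparing the closed-form Gaussian relative entropy against a direct numerical integration of ∫ f log(f/g); the particular means and variances are arbitrary assumptions chosen so that f/g stays bounded, as the theorem requires.

```python
import math

def kl_gauss(mf, sf, mg, sg):
    """Closed-form relative entropy I(f||g) in nats for
    f = N(mf, sf^2) and g = N(mg, sg^2)."""
    return math.log(sg / sf) + (sf**2 + (mf - mg)**2) / (2 * sg**2) - 0.5

def kl_numeric(mf, sf, mg, sg, lo=-20.0, hi=20.0, n=200_000):
    """Midpoint-rule approximation of the integral f*log(f/g) dx."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = math.exp(-(x - mf)**2 / (2 * sf**2)) / (sf * math.sqrt(2 * math.pi))
        g = math.exp(-(x - mg)**2 / (2 * sg**2)) / (sg * math.sqrt(2 * math.pi))
        total += f * math.log(f / g) * dx
    return total

# Hypothetical example: a code designed for g = N(0, 4) is applied to
# the true source f = N(0.5, 1).  Since sf < sg, f/g is bounded.
closed = kl_gauss(0.5, 1.0, 0.0, 2.0)
approx = kl_numeric(0.5, 1.0, 0.0, 2.0)
print(f"I(f||g) closed form: {closed:.6f} nats "
      f"({closed / math.log(2):.6f} bits)")
print(f"I(f||g) numeric:     {approx:.6f} nats")
```

Under the theorem, this I(f‖g) (converted to bits) is the asymptotic extra rate per sample paid for using the mismatched code; by the companion result, that loss can be recovered by redesigning only the lossless (entropy-coding) stage.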

