Abstract
This paper presents a reduced-complexity deterministic annealing (DA) approach for vector quantizer (VQ) design that uses soft information processing with simplified assignment measures. Low-complexity distributions are designed to mimic the Gibbs distribution, the optimal distribution used in the standard DA method. These low-complexity distributions are simple enough to allow fast computation, yet they approximate the Gibbs distribution closely enough to yield near-optimal performance. We also derive the theoretical performance loss, at a given system entropy, incurred by using the simple soft measures instead of the optimal Gibbs measure, and we use this result to obtain optimal annealing schedules for the simple soft measures that approximate the annealing schedule of the optimal Gibbs distribution. The proposed reduced-complexity DA algorithms significantly improve the quality of the final codebooks compared to the generalized Lloyd algorithm and standard stochastic relaxation techniques, both with and without pairwise nearest neighbor (PNN) codebook initialization. The proposed algorithms are able to evade local minima, and the results show that they are not sensitive to the choice of the initial codebook. Compared to the standard DA approach, the reduced-complexity DA algorithms can operate over 100 times faster with negligible performance difference. For example, for the design of a 16-dimensional vector quantizer with a rate of 0.4375 bit/sample for a Gaussian source, the standard DA algorithm achieved 3.60 dB performance in 16 483 CPU seconds, whereas the reduced-complexity DA algorithm achieved the same performance in 136 CPU seconds. Beyond VQ design, the DA techniques are applicable to problems such as classification, clustering, and resource allocation.
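To make the probabilistic framework concrete, here is a minimal sketch of a DA iteration using the optimal Gibbs soft measure, assuming squared-error distortion. Note that the standard DA grows the codebook through phase transitions as the temperature drops; for brevity this sketch fixes the codebook size from the start, and the initialization, cooling factor, and stopping rule are illustrative choices rather than the paper's.

```python
import numpy as np

def gibbs_assignments(X, C, T):
    """Optimal Gibbs soft assignments p(c_j | x_i) at temperature T.

    X: (N, d) training vectors; C: (M, d) codevectors.
    Returns an (N, M) matrix whose rows sum to 1.
    """
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)  # (N, M) squared distances
    logits = -d2 / T
    logits -= logits.max(axis=1, keepdims=True)              # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)

def da_vq(X, M, T0=10.0, alpha=0.95, T_min=1e-3, iters_per_T=20, seed=0):
    """DA codebook design with a geometric cooling schedule (illustrative)."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=M, replace=False)].astype(float)
    T = T0
    while T > T_min:
        for _ in range(iters_per_T):
            P = gibbs_assignments(X, C, T)
            # Centroid step: each codevector moves to the P-weighted mean of the data.
            C = (P.T @ X) / (P.sum(axis=0)[:, None] + 1e-12)
        T *= alpha  # cool down; the paper derives optimal schedules instead
    return C
```

At high temperature the assignments are nearly uniform and the objective is smooth; as T decreases the assignments harden toward nearest-neighbor quantization, which is how DA deterministically avoids poor local minima.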
Highlights
Vector quantization is a source coding technique that approximates blocks of input data by one of a finite number of prestored vectors in a codebook.
We divided 16 384 samples into 1024 16-dimensional training vectors and designed codebooks of sizes 32, 64, 128, and 256 for both training sets, with the initial codebooks drawn at random from the training sets. Since both the generalized Lloyd algorithm (GLA) and stochastic relaxation (SR-D) are sensitive to the choice of the initial codebook, we also designed codebooks of sizes 32, 64, 128, and 256 with initial codebooks obtained by the pairwise nearest neighbor (PNN) algorithm [33], in order to investigate the effect of initialization (a sketch of PNN follows these highlights).
The proposed low-complexity soft measures serve as the soft association probabilities in the probabilistic framework of deterministic annealing (DA), reducing the computational cost relative to the optimal Gibbs soft measure used in the standard DA.
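The PNN initializer cited in the highlights builds a codebook bottom-up by merging clusters. As a reminder of the idea (a naive sketch, not the paper's implementation; the function name and brute-force pair search are ours):

```python
import numpy as np

def pnn_init(X, M):
    """Pairwise nearest neighbor (PNN) codebook initialization, naive sketch.

    Every training vector starts as its own cluster; the pair whose merge
    adds the least squared-error distortion (Ward-style merge cost) is
    merged repeatedly until M clusters remain.  This version rescans all
    pairs each round; practical implementations use heaps and partial
    updates to avoid the O(N^3) total cost.
    """
    centroids = [x.astype(float) for x in X]
    counts = [1] * len(X)
    while len(centroids) > M:
        bi, bj, best = 0, 1, np.inf
        for i in range(len(centroids)):
            for j in range(i + 1, len(centroids)):
                ni, nj = counts[i], counts[j]
                cost = ni * nj / (ni + nj) * float(np.sum((centroids[i] - centroids[j]) ** 2))
                if cost < best:
                    bi, bj, best = i, j, cost
        ni, nj = counts[bi], counts[bj]
        centroids[bi] = (ni * centroids[bi] + nj * centroids[bj]) / (ni + nj)
        counts[bi] = ni + nj
        del centroids[bj], counts[bj]
    return np.array(centroids)
```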
Summary
Vector quantization is a source coding technique that approximates blocks (or vectors) of input data by one of a finite number of prestored vectors in a codebook. Earlier stochastic relaxation (SR) techniques allowed random search moves on the energy surface in order to give the system the ability to escape local minima. Unlike these SR techniques, the deterministic annealing (DA) approach to optimal vector quantizer design casts the problem in a probabilistic framework and deterministically optimizes the probabilistic objective function in each iteration [13]. In contrast to the standard DA, which starts with essentially a single codevector and grows the codebook through iterations, the SVQ design starts with the required number of codevectors and optimizes their locations through iterations. It is observed, and empirically shown, that the importance of a codevector for a given sample vector (in terms of the amount of probability mass associated with it) decreases exponentially fast with its distance from the sample vector, even at relatively high temperatures.
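This exponential decay is what makes low-complexity soft measures viable. The excerpt does not spell out the paper's simplified measures, but one natural illustration, again assuming squared-error distortion, is to keep the Gibbs mass only on each sample's k nearest codevectors and renormalize; the parameter k and the brute-force neighbor search below are our illustrative choices, not necessarily the paper's:

```python
import numpy as np

def truncated_gibbs_assignments(X, C, T, k=4):
    """Low-complexity soft measure: Gibbs weights kept only on the k
    nearest codevectors of each sample, then renormalized.

    Because the Gibbs mass decays exponentially with distance, the
    discarded tail is negligible even at fairly high temperatures.
    Returns (idx, W): per-sample indices of the k nearest codevectors
    and the matching normalized weights, each of shape (N, k).
    """
    # A full implementation would avoid computing all N*M distances
    # (e.g. via a partial distance search); the brute-force step here
    # only keeps the sketch short.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    idx = np.argpartition(d2, k - 1, axis=1)[:, :k]
    d2k = np.take_along_axis(d2, idx, axis=1)
    logits = -d2k / T
    logits -= logits.max(axis=1, keepdims=True)
    W = np.exp(logits)
    return idx, W / W.sum(axis=1, keepdims=True)
```

With such a measure the weighted centroid update touches only N*k nonzero weights instead of N*M, which suggests how a reduced-complexity DA can run orders of magnitude faster than the full Gibbs computation while changing the assignments only negligibly.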