Abstract

Vector quantization (VQ) is an efficient compact encoding technique for approximate nearest neighbor (ANN) search in many vision applications. A representative VQ approach is product quantization (PQ), which decomposes the space into a Cartesian product of subspaces, quantizes each subspace separately, and achieves high accuracy. However, its space decomposition still introduces quantization distortion. This paper presents two optimized solutions based on residual vector quantization (RVQ). Unlike PQ, RVQ reduces quantization error by passing the residual of each stage to a further quantizer, rather than by decomposing the space. To further optimize the codebooks and the space projection, we seek a more discriminative projection of the data, realized by an orthonormal matrix R. The nonparametric solution alternately optimizes R and the stage codebooks over multiple iterations using Singular Value Decomposition (SVD). The parametric solution assumes the data follow a Gaussian distribution and uses Eigenvalue Allocation to obtain each stage matrix {R_l} (1 ≤ l ≤ L) in a single pass, where L is the number of RVQ stages. Compared with various optimized PQ-based methods, our methods show a clear advantage in reducing quantization error.
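To illustrate the general idea behind the nonparametric solution, the sketch below alternates between training stage codebooks on successive residuals and updating an orthonormal rotation R via an SVD-based orthogonal Procrustes step. It is only a minimal illustration under assumed details (k-means codebook training, greedy residual encoding, hypothetical function and variable names); it is not the authors' implementation.

```python
# Minimal sketch of residual vector quantization (RVQ) with a learned
# orthonormal rotation R. Codebook training via k-means and the SVD update
# step are assumptions for illustration, not the paper's exact algorithm.
import numpy as np
from scipy.cluster.vq import kmeans2

def train_rvq(X, n_stages=4, n_centroids=256, n_iters=10):
    """Alternately optimize the rotation R and the stage codebooks."""
    d = X.shape[1]
    R = np.eye(d)
    for _ in range(n_iters):
        # 1) Fix R: train each stage codebook on the residual of the previous stage.
        residual = X @ R
        codebooks, codes = [], []
        for _ in range(n_stages):
            centroids, labels = kmeans2(residual, n_centroids, minit='points')
            codebooks.append(centroids)
            codes.append(labels)
            residual = residual - centroids[labels]
        # 2) Fix codebooks: update R by orthogonal Procrustes,
        #    i.e. minimize ||X R - X_hat||_F over orthonormal R via SVD.
        X_hat = sum(cb[c] for cb, c in zip(codebooks, codes))
        U, _, Vt = np.linalg.svd(X.T @ X_hat)
        R = U @ Vt
    return R, codebooks

def encode(x, R, codebooks):
    """Greedy multi-stage encoding: quantize the remaining residual at each stage."""
    residual = x @ R
    code = []
    for centroids in codebooks:
        idx = int(np.argmin(np.linalg.norm(residual - centroids, axis=1)))
        code.append(idx)
        residual = residual - centroids[idx]
    return code
```

In this sketch each stage stores one codeword index per vector, so the reconstruction is the sum of the selected codewords across all L stages, which is the sense in which RVQ progressively "restores" the quantization error.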
