Abstract
The last few years have witnessed the rise of the big data era, in which approximate nearest neighbor search is a fundamental problem in many applications, such as large-scale image retrieval. Recently, many studies have demonstrated that hashing can achieve promising performance due to its appealing storage and search efficiency. Since the optimization problems induced by their loss functions are difficult to solve directly, most hashing methods decompose the hash code learning problem into two steps: projection and quantization. In the quantization step, binary codes are widely used because ranking them by Hamming distance is very efficient. However, the quantization step incurs substantial information loss, which must be reduced in applications such as image retrieval, where high search accuracy is required. Since many two-step hashing methods produce projected dimensions of uneven informativeness in the projection step, in this paper we propose a novel dimension analysis based quantization (DAQ) method for two-step hashing in image retrieval. We first perform an importance analysis of the projected dimensions and select a subset of them that are more informative than the others. We then divide the selected projected dimensions into several regions with our quantizer, and every region is quantized with its corresponding codebook. Finally, the similarity between two hash codes is estimated by the Manhattan distance between their corresponding codebooks, which is also efficient to compute. We conduct experiments on three public benchmarks containing up to one million descriptors and show that the proposed DAQ method consistently leads to significant accuracy improvements over state-of-the-art quantization methods.
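To make the pipeline described above concrete, here is a minimal NumPy sketch of the select-then-quantize idea: score each projected dimension, keep the most informative ones, assign each value a small integer region code, and compare codes by Manhattan distance. The variance-based importance score, the quartile region boundaries, and the names `daq_quantize` and `manhattan_distance` are illustrative assumptions; the paper's actual dimension analysis and codebook learning may differ.

```python
import numpy as np

def daq_quantize(X, num_dims=32):
    """Quantize projected features X (n_samples, n_proj_dims) into
    small integer codes, following the abstract's outline."""
    # Hypothetical importance score: per-dimension variance of the
    # projections (a stand-in for the paper's dimension analysis).
    importance = X.var(axis=0)
    selected = np.argsort(importance)[::-1][:num_dims]
    Xs = X[:, selected]
    # Assumed quantizer: split each selected dimension into 4 regions
    # at its quartiles; the region index serves as the codeword.
    thresholds = np.percentile(Xs, [25, 50, 75], axis=0)  # shape (3, num_dims)
    codes = (Xs[None, :, :] > thresholds[:, None, :]).sum(axis=0)  # values 0..3
    return codes.astype(np.uint8), selected, thresholds

def manhattan_distance(code_a, code_b):
    # Codes are ranked by Manhattan (L1) distance, which for small
    # integer codes is a cheap sum of absolute differences.
    return np.abs(code_a.astype(np.int32) - code_b.astype(np.int32)).sum()

# Toy usage: 1000 projected 64-d descriptors, keep the 32 most informative dims.
X = np.random.randn(1000, 64)
codes, dims, th = daq_quantize(X, num_dims=32)
print(manhattan_distance(codes[0], codes[1]))
```

Using multi-bit integer codes per dimension rather than a single sign bit is what lets the Manhattan distance preserve more of the projected geometry than Hamming distance on binary codes, at a comparable lookup cost.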