Abstract
Hashing has received broad attention in large-scale image retrieval due to its appealing efficiency in computation and storage. In particular, with the advent of deep learning, much effort has been directed toward using deep neural networks to learn feature representations and hash codes simultaneously, and the resulting deep hashing methods have shown superior performance over conventional hashing methods. In this paper, we propose Deep Attention Sampling Hashing (DASH), a novel deep hashing method that yields high-quality hash codes to enable efficient image retrieval. Specifically, we employ two sub-networks in DASH, i.e., a master branch and a part branch, to capture global structure features and discriminative feature representations, respectively. Furthermore, we develop an Attention Sampler Module (ASM), which consists of an Object Region Extraction (ORE) block and an Informative Patch Generation (IPG) block, to yield more informative image patches. The ORE block provides a well-designed multi-scale attentional fusion mechanism to highlight and extract the significant regions of images, and the IPG block employs a direction-specific shift mechanism to generate the desired image patches with discriminative details. Both blocks can be seamlessly integrated into various convolutional neural network (CNN) architectures. Subsequently, we conduct knowledge distillation optimization to transfer the details learned by the part branch into the master branch to guide hash code learning. In addition, we design a Weibull quantization loss to minimize the information loss caused by binary quantization. Experimental results on three benchmark datasets demonstrate the effectiveness of the proposed DASH with respect to different evaluation metrics.
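To make the two-branch design concrete, the following is a minimal, illustrative PyTorch sketch of a master/part hashing pipeline trained with a distillation term and a quantization penalty. All names, the backbone choice, the feature dimensions, and the loss forms here are assumptions for illustration only; the paper's ASM (ORE/IPG blocks), its exact knowledge distillation objective, and the Weibull quantization loss are not reproduced.

```python
# Illustrative two-branch deep hashing sketch (not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models


class HashBranch(nn.Module):
    """A CNN backbone followed by a hashing head that outputs relaxed codes."""

    def __init__(self, code_length: int = 48):
        super().__init__()
        backbone = models.resnet18(weights=None)           # placeholder backbone
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        self.hash_head = nn.Linear(backbone.fc.in_features, code_length)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x).flatten(1)
        return torch.tanh(self.hash_head(f))                # relaxed codes in (-1, 1)


def training_step(master: HashBranch, part: HashBranch,
                  images: torch.Tensor, patches: torch.Tensor,
                  alpha: float = 0.5, beta: float = 0.1) -> torch.Tensor:
    """One step combining a distillation surrogate (part -> master) and a simple
    quantization penalty; both terms are stand-ins for the paper's losses."""
    master_codes = master(images)                           # global-structure branch
    part_codes = part(patches)                              # discriminative-patch branch

    # Distillation surrogate: pull master codes toward the part branch's codes.
    distill_loss = F.mse_loss(master_codes, part_codes.detach())

    # Quantization surrogate: push relaxed codes toward {-1, +1} before sign().
    quant_loss = (master_codes.abs() - 1.0).pow(2).mean()

    return alpha * distill_loss + beta * quant_loss


if __name__ == "__main__":
    master, part = HashBranch(), HashBranch()
    imgs = torch.randn(4, 3, 224, 224)                      # full images
    crops = torch.randn(4, 3, 224, 224)                     # attention-sampled patches (assumed input)
    loss = training_step(master, part, imgs, crops)
    loss.backward()
    print(f"toy loss: {loss.item():.4f}")
```

In this sketch the patch inputs stand in for the output of the attention sampler; at retrieval time only the master branch would be binarized (e.g., via sign of the relaxed codes), which is consistent with distilling the part branch's details into the master branch.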