Abstract

Hashing enables efficient retrieval and storage of large-scale images owing to its binary representation. In real applications, the trade-off between retrieval accuracy and speed is essential when designing a hashing framework and is reflected in the choice of hash code length. Consequently, existing hashing methods must train a separate model for each hash code length, which incurs considerable training time and reduces hashing flexibility. Moreover, since a sample can be represented by hash codes of different lengths, there are helpful relationships among these codes that could boost the performance of hashing methods, but existing methods do not fully exploit them. To address these issues, we propose a new model, supervised discrete multiple-length hashing (SDMLH), which learns hash codes of multiple lengths simultaneously. In SDMLH, three types of information, derived respectively from the hash codes of different lengths, the original features of the samples, and the labels, are applied for hash learning. Unlike existing hashing methods, SDMLH fully exploits the mutual assistance among hash codes of different lengths and learns them in one step. Furthermore, given a hash length that meets a user's demand, we propose a hash fusion strategy that obtains a code of the desired length by fusing the multiple-length hash codes; the fused code outperforms a code of the same length learned directly. In addition, with this fusion strategy, SDMLH can generate a hash code of any length shorter than the total length of the given multiple hash codes. To the best of our knowledge, SDMLH is one of the first attempts to learn multiple-length hash codes simultaneously. Extensive experiments on three benchmark datasets demonstrate the superiority of the proposed method.
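The abstract does not describe the fusion mechanism itself, so the following is only a minimal sketch of the general idea: producing a hash code of a user-specified length from several jointly learned codes of different lengths, assuming fusion is modeled as concatenation followed by truncation. The function names, the 16/32/64-bit lengths, and the truncation rule are illustrative assumptions, not the paper's actual SDMLH strategy.

```python
import numpy as np

def fuse_hash_codes(codes, target_length):
    """Fuse hash codes of different lengths into one code of `target_length` bits.

    `codes` is a list of {-1, +1} arrays, each of shape (n_samples, L_k).
    Assumption: fusion is approximated here by concatenation plus truncation,
    so `target_length` may be any value up to the sum of the individual lengths.
    """
    fused = np.concatenate(codes, axis=1)            # shape (n_samples, sum of L_k)
    total_bits = fused.shape[1]
    if target_length > total_bits:
        raise ValueError(f"target_length {target_length} exceeds total bits {total_bits}")
    return fused[:, :target_length]                  # keep the first target_length bits

def hamming_rank(query_code, database_codes):
    """Rank database items by Hamming distance to a single query code."""
    # For {-1, +1} codes, Hamming distance = (L - inner product) / 2.
    L = database_codes.shape[1]
    dists = (L - database_codes.astype(np.int32) @ query_code.astype(np.int32)) // 2
    return np.argsort(dists)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1000
    # Hypothetical jointly learned codes of 16, 32, and 64 bits (random stand-ins).
    codes = [np.sign(rng.standard_normal((n, L))).astype(np.int8) for L in (16, 32, 64)]
    db_48 = fuse_hash_codes(codes, target_length=48)   # any length up to 16 + 32 + 64 bits
    ranking = hamming_rank(db_48[0], db_48)
    print(db_48.shape, ranking[:5])
```

In this toy usage, the three code matrices stand in for the multiple-length codes learned in one step, and a 48-bit code is assembled on demand without retraining, which is the flexibility the abstract claims for the fusion strategy.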
