Abstract

Learning-based hashing has been widely used for large-scale similarity retrieval because binary representations are efficient to compute and compact to store. In this paper, we propose AdaLabelHash, a hash function learning approach based on neural networks. In AdaLabelHash, the class label representations are adaptable during network training. We express the labels as hypercube vertices in a K-dimensional space, and both the network weights and the class label representations are updated in the learning process. Because the label representations are learned from data, semantically similar categories are assigned label representations that are close to each other in Hamming distance in the label space. The label representations then serve as the target outputs of the hash function learning, yielding compact and discriminative binary hash codes via the network. AdaLabelHash is simple yet effective: it jointly learns label representations and infers compact binary codes from data, and it is applicable to both supervised and semi-supervised hash code learning. Experimental results on standard benchmarks show the effectiveness of AdaLabelHash.
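To make the idea concrete, the sketch below illustrates one plausible reading of the abstract in PyTorch: a hashing network that outputs K-dimensional relaxed codes, together with trainable class-label representations kept near the vertices of the {-1, +1}^K hypercube via tanh. The network architecture, loss terms, and all dimensions are assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class AdaLabelHashSketch(nn.Module):
    """Illustrative sketch (not the authors' code): a hashing network with
    trainable class-label representations relaxed toward {-1,+1}^K vertices."""
    def __init__(self, in_dim, num_classes, code_len):
        super().__init__()
        # Backbone producing K-dimensional relaxed codes in (-1, 1).
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 512), nn.ReLU(),
            nn.Linear(512, code_len), nn.Tanh(),
        )
        # Trainable label representations; tanh keeps them near hypercube vertices.
        self.label_logits = nn.Parameter(torch.randn(num_classes, code_len))

    def label_codes(self):
        return torch.tanh(self.label_logits)

    def forward(self, x):
        return self.encoder(x)

def adalabel_loss(codes, labels, label_codes, quant_weight=0.1):
    """Hypothetical training objective: pull each relaxed code toward its
    class's label representation and push codes toward binary values."""
    targets = label_codes[labels]                         # (B, K) class targets
    fit = ((codes - targets) ** 2).sum(dim=1).mean()      # match label representation
    quant = ((codes.abs() - 1.0) ** 2).sum(dim=1).mean()  # quantization penalty
    return fit + quant_weight * quant

# Minimal usage example with random data (hypothetical dimensions).
model = AdaLabelHashSketch(in_dim=2048, num_classes=10, code_len=48)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 2048)
y = torch.randint(0, 10, (32,))
codes = model(x)
loss = adalabel_loss(codes, y, model.label_codes())
opt.zero_grad(); loss.backward(); opt.step()
binary = torch.sign(model(x)).detach()  # final binary hash codes for retrieval
```

Because both the encoder weights and `label_logits` receive gradients from the same objective, semantically similar classes can drift toward nearby hypercube vertices during training, which is the joint learning behavior the abstract describes.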
