Abstract
Supervised hashing has achieved higher accuracy than unsupervised hashing in many practical applications owing to its use of semantic label information. However, the mutual relationships between semantic labels are often ignored when this label information is leveraged. In addition, a major challenge in learning to hash is handling the discrete constraints imposed on the hash codes, which typically turn the hash optimization into an NP-hard problem. To address these issues, a form of supervised discrete hashing that learns mutual similarities is proposed. Unlike existing supervised hashing methods that learn hash codes through least-squares classification by regressing the hash codes to their corresponding labels, the proposed method leverages the mutual relations between different semantic labels to learn more stable hash codes. It also simultaneously learns the discrete hash codes for the training samples and the projections between the original features and their corresponding hash codes for out-of-sample cases. Experiments were performed on two public datasets, and the results demonstrate the superiority of the proposed method.
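To make the ingredients named above concrete, the following is a minimal illustrative sketch, not the paper's exact algorithm: it builds a pairwise semantic similarity matrix from labels (standing in for the "mutual relations" between labels), alternates between a ridge-regression projection for out-of-sample hashing and a sign-based discrete code update, and hashes an unseen sample with the learned projection. All variable names, the similarity construction, and the update rule are assumptions chosen for demonstration.

```python
import numpy as np

# Toy supervised discrete hashing sketch (illustrative assumptions only).
rng = np.random.default_rng(0)
n, d, n_bits, n_classes = 100, 16, 8, 3

X = rng.standard_normal((n, d))          # training features
y = rng.integers(0, n_classes, size=n)   # semantic labels

# Pairwise semantic similarity from labels: +1 same class, -1 otherwise.
S = np.where(y[:, None] == y[None, :], 1.0, -1.0)

# Initialize binary codes, then alternate two steps:
#   (1) fit projection W from features to codes (ridge regression),
#   (2) discrete code update via sign of similarity-weighted codes + X W.
B = np.sign(rng.standard_normal((n, n_bits)))
lam = 1.0
for _ in range(10):
    # Step 1: W minimizes ||X W - B||^2 + lam * ||W||^2 in closed form.
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ B)
    # Step 2: keep codes discrete by taking the sign.
    B = np.sign(S @ B / n_bits + X @ W)
    B[B == 0] = 1  # break ties so codes stay in {-1, +1}

def hash_new(x):
    """Hash an unseen sample with the learned projection (out-of-sample)."""
    code = np.sign(x @ W)
    code[code == 0] = 1
    return code

query_code = hash_new(rng.standard_normal(d))
```

The two alternating steps mirror the structure described in the abstract: discrete codes for training samples are optimized directly under the sign constraint, while the projection `W` handles queries not seen at training time.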