Abstract

Supervised hashing has achieved better accuracy than unsupervised hashing in many practical applications owing to its use of semantic label information. However, the mutual relationships between semantic labels are usually ignored when leveraging label information. In addition, the major challenge in learning to hash is handling the discrete constraints imposed on the hash codes, which typically make the hash optimization NP-hard. To address these issues, a form of supervised discrete hashing that learns mutual similarities is proposed. Unlike existing supervised hashing methods that learn hash codes via least-squares classification by regressing the hash codes to their corresponding labels, we leverage the mutual relations between different semantic labels to learn more stable hash codes. In addition, the proposed method simultaneously learns the discrete hash codes for training samples and the projections between the original features and their corresponding hash codes for out-of-sample cases. Experiments were performed on two public datasets, and the results demonstrate the superiority of the proposed method.
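As a rough illustration of the ingredients described above (not the paper's actual algorithm), the sketch below builds a mutual similarity matrix from semantic label vectors and produces discrete hash codes via the sign of a linear projection, which also handles out-of-sample queries. The code length `k`, the cosine similarity measure, and the random projection `W` are all placeholder assumptions; in the paper, the codes and projection would be learned jointly under the discrete constraints.

```python
# Hypothetical sketch of supervised hashing with a label-derived
# mutual similarity matrix; W is a random stand-in for a learned projection.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples, d features, c binary semantic labels, k-bit codes.
n, d, c, k = 6, 4, 3, 8
X = rng.standard_normal((n, d))
Y = rng.integers(0, 2, size=(n, c)).astype(float)

# Mutual similarity between samples from their label vectors
# (cosine similarity here; any label-overlap measure could be used).
norms = np.linalg.norm(Y, axis=1, keepdims=True) + 1e-12
S = (Y / norms) @ (Y / norms).T          # n x n, symmetric, values in [0, 1]

# Placeholder projection; a learned W would fit real-valued codes to a
# similarity-preserving target derived from S.
W = rng.standard_normal((d, k))

# Discrete hash codes: sign of the projection, mapped into {-1, +1}.
B = np.sign(X @ W)
B[B == 0] = 1

# Out-of-sample extension: a new query reuses the same projection W.
x_new = rng.standard_normal(d)
b_new = np.sign(x_new @ W)
b_new[b_new == 0] = 1
```

The point of the out-of-sample projection is that queries never seen during training can still be mapped to binary codes with one matrix multiply and a sign operation.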

