Abstract

Hashing methods have attracted great attention for cross-modal retrieval due to their low memory requirements and fast computation. Cross-modal hashing methods aim to map data from different modalities into a common Hamming space. However, most existing cross-modal hashing methods place no restriction on the Hamming distance between dissimilar instances. Moreover, most cross-modal hashing methods relax the discrete constraints and then quantize the resulting continuous values into hash codes, which introduces quantization error and degrades retrieval performance. To address these problems, we propose a novel supervised cross-modal hashing method, termed Discrete Similarity Preserving Hashing (DSPH). DSPH simultaneously preserves inter-modality and intra-modality similarity. Specifically, DSPH constrains both similar and dissimilar instance pairs to learn more discriminative hash codes. Moreover, we present a discrete gradient descent algorithm that solves the optimization problem directly in the discrete domain. Extensive experiments conducted on the Wiki and NUS-WIDE datasets show that DSPH improves retrieval performance compared with several state-of-the-art cross-modal hashing methods.
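To illustrate the general idea behind the abstract's two criticisms, the toy sketch below learns binary codes that preserve a pairwise similarity matrix without ever relaxing the discrete constraints. It uses a generic KSH/SDH-style objective ||kS - BB^T||_F^2 with a greedy bit-flipping update; this is an assumed, simplified stand-in for illustration only, not the actual DSPH formulation or its discrete gradient descent algorithm, and all names and parameters here are hypothetical.

```python
import numpy as np

# Toy illustration: learn binary codes B in {-1,+1}^{n x k} that preserve a
# pairwise similarity matrix S (S_ij = +1 for similar pairs, -1 otherwise)
# by minimizing ||k*S - B @ B.T||_F^2. This generic objective and the greedy
# update below are illustrative assumptions, not the DSPH method itself.

rng = np.random.default_rng(0)

n, k = 40, 8                        # number of instances, code length
labels = rng.integers(0, 4, n)      # hypothetical class labels
S = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)

B = rng.choice([-1.0, 1.0], size=(n, k))   # random initial binary codes

def objective(B):
    return np.linalg.norm(k * S - B @ B.T) ** 2

# Greedy discrete descent: flip one bit at a time and keep the flip only if
# it lowers the objective. Every iterate stays exactly binary, so there is no
# relax-then-quantize step and hence no quantization error of that kind.
for it in range(10):
    improved = False
    for i in range(n):
        for j in range(k):
            before = objective(B)
            B[i, j] *= -1
            if objective(B) >= before:
                B[i, j] *= -1       # revert: the flip did not help
            else:
                improved = True
    if not improved:
        break

# Hamming distance recovered from code inner products: d = (k - <b_i, b_j>)/2.
ham = (k - B @ B.T) / 2
print("final objective:", objective(B))
print("mean Hamming distance, similar pairs:   ", ham[S > 0].mean())
print("mean Hamming distance, dissimilar pairs:", ham[S < 0].mean())
```

After a few sweeps, similar pairs end up with small Hamming distances and dissimilar pairs with large ones, which is the behavior the abstract argues for when restrictions are placed on both kinds of pairs.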
