Abstract

Cross-modal hashing has become a key technology for large-scale data retrieval. However, several challenges remain: 1) how to effectively embed semantic information into hash learning; 2) how to strengthen cross-modal mutual communication and joint learning; 3) how to effectively handle data extension with semantics. In this paper, we propose a novel Extensible Multi-similarity Cross-modal Hash Learning (EMCHL) method. To reduce inter-modal discrepancy and strengthen cross-modal mutual communication, EMCHL considers three similarity measures: 1) the Hamming distance measures the semantic-structure self-similarity between labels and sample hash codes within each modality, which reduces the sample quantization error; 2) a log-likelihood function enhances information exchange between cross-modal hash codes; this correspondence-based cross-modal reciprocal-similarity measure reduces inter-modal quantization errors; 3) a robust low-rank constraint on the dot product performs a cross-modal sparse latent-similarity calculation, establishing the connection between the common latent semantic space and the hash codes. In addition, we propose an Interactive Projection Matrix Learning (IPML) extensibility mechanism that makes the method easier to apply to large-scale retrieval datasets. Extensive experimental results show the superior performance of EMCHL compared with state-of-the-art cross-modal hashing methods. Our code is released at: https://github.com/Tjeep-Tan/EMCHL.
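To make the three similarity measures concrete, the sketch below shows one common way each term is instantiated in cross-modal hashing. It is a minimal illustration under our own assumptions (NumPy, relaxed codes signed to ±1, the standard pairwise-likelihood form, and a nuclear-norm surrogate for the low-rank constraint); the function names and shapes are hypothetical and this is not the released EMCHL implementation.

```python
import numpy as np

def hamming_distance(b1, b2):
    """Hamming distance between two {-1, +1} codes of length K:
    d_H = (K - <b1, b2>) / 2. Used for intra-modal
    self-similarity between label and sample codes."""
    return 0.5 * (b1.shape[0] - b1 @ b2)

def pairwise_loglikelihood_loss(Hx, Hy, S):
    """Negative log-likelihood over cross-modal code pairs,
    with theta_ij = 0.5 * <h_i^x, h_j^y> and S the 0/1 semantic
    similarity matrix. This is the standard pairwise likelihood
    form used in many cross-modal hashing papers."""
    theta = 0.5 * Hx @ Hy.T
    return -np.sum(S * theta - np.logaddexp(0.0, theta))

def lowrank_penalty(Hx, Hy):
    """Nuclear norm (sum of singular values) of the cross-modal
    dot-product similarity matrix, a convex surrogate for a
    low-rank constraint on the latent similarity."""
    return np.linalg.norm(Hx @ Hy.T, ord="nuc")

# Toy example: n samples, K-bit relaxed codes for two modalities.
rng = np.random.default_rng(0)
n, K = 8, 16
Hx = np.tanh(rng.standard_normal((n, K)))     # e.g., image-modality codes
Hy = np.tanh(rng.standard_normal((n, K)))     # e.g., text-modality codes
S = (rng.random((n, n)) > 0.5).astype(float)  # toy semantic affinity

print(hamming_distance(np.sign(Hx[0]), np.sign(Hy[0])))
print(pairwise_loglikelihood_loss(Hx, Hy, S))
print(lowrank_penalty(Hx, Hy))
```

In such formulations the three terms are typically summed with trade-off weights and minimized jointly over the hash codes and projection matrices; the exact objective and weighting in EMCHL are given in the paper and repository above.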
