Abstract

Unsupervised hashing has been extensively applied in large-scale multi-modal retrieval by mapping original data from heterogeneous modalities into unified binary codes. However, challenges remain, especially in balancing the individual modality-specific representations against a common representation that preserves the intrinsic linkages among heterogeneous modalities. In this paper, we propose a novel fast Unsupervised Multi-modal Hashing method based on Piecewise Learning, denoted as UMHPL, to address this issue. First, we formulate the problem as matrix factorization to efficiently derive the individual modality-specific latent representations and the common latent representation with consensus matrices. To maintain the integrity of the multi-modal data, we integrate the modalities via adaptive weight factors and nuclear norm minimization. Subsequently, we connect the individual modality-specific latent representations and the common latent representation through a piecewise hash learning framework, which reinforces the discriminative capability of the model and yields more compact hash codes. Finally, an effective discrete optimization algorithm is developed to solve the resulting problem. Comprehensive experiments on the Wiki, MIRFlickr, NUS-WIDE, and MSCOCO datasets demonstrate the superior performance of UMHPL over state-of-the-art hashing methods.
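To make the general idea concrete, below is a minimal, hedged sketch of the kind of pipeline the abstract describes: collective matrix factorization of each modality into a modality-specific latent representation, adaptive weighting of the modalities into a common representation, and sign-based binarization into hash codes. This is only an illustrative approximation, not the authors' UMHPL algorithm; the function name `umhpl_sketch`, the alternating least-squares updates, the inverse-error weighting rule, and all parameters are assumptions for illustration, and the nuclear norm minimization and piecewise learning components of the paper are not modeled here.

```python
import numpy as np

def umhpl_sketch(views, latent_dim=16, n_iter=50, seed=0):
    """Illustrative collective matrix factorization for multi-modal hashing.

    Each view X_m (d_m x n) is factorized as X_m ~ U_m @ V_m; the
    modality-specific representations V_m are fused into a common
    representation with adaptive weights, and hash codes are its sign.
    NOTE: a hedged sketch, not the UMHPL method from the paper.
    """
    rng = np.random.default_rng(seed)
    n = views[0].shape[1]
    M = len(views)
    U = [rng.standard_normal((X.shape[0], latent_dim)) for X in views]
    V = [rng.standard_normal((latent_dim, n)) for _ in views]
    alpha = np.ones(M) / M  # adaptive modality weights (uniform init)

    for _ in range(n_iter):
        # Alternating least-squares updates per modality
        for m, X in enumerate(views):
            U[m] = X @ V[m].T @ np.linalg.pinv(V[m] @ V[m].T)
            V[m] = np.linalg.pinv(U[m].T @ U[m]) @ U[m].T @ X
        # Weighted fusion into a common latent representation
        common = sum(a * Vm for a, Vm in zip(alpha, V))
        # Re-weight modalities inversely to their reconstruction error
        errs = np.array([np.linalg.norm(X - U[m] @ V[m])
                         for m, X in enumerate(views)])
        alpha = 1.0 / (errs + 1e-12)
        alpha /= alpha.sum()

    return np.sign(common)  # binary codes in {-1, +1}

# Toy usage: an image view and a text view describing the same 100 samples
X_img = np.random.rand(512, 100)
X_txt = np.random.rand(300, 100)
codes = umhpl_sketch([X_img, X_txt], latent_dim=32)
```

In the actual paper the binarization is handled by a discrete optimization scheme rather than a plain sign relaxation, and the common representation is tied to the modality-specific ones through the piecewise learning objective; the sketch above only conveys the factorize-fuse-binarize structure.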


