Abstract
Matrix factorization has been applied to cross-view hashing, where basis functions are learned to map data from different views into a common Hamming embedding. However, the basis functions relating the Hamming embedding to the original data matrix may encode rather complex hierarchical information that existing methods cannot capture. In addition, previous matrix-factorization-based hashing work relies on relaxation techniques, which may lead to large quantization error. To address these issues, this paper presents Supervised Discrete Deep Matrix Factorization (SDDMF), a novel method for cross-view hashing. We introduce deep matrix factorization so that SDDMF learns a set of hierarchical basis functions together with unified binary codes across views. A classification error term is further incorporated into the objective so that the learned binary codes are discriminative. We then employ a linearization technique to optimize the objective directly under the discrete constraints, which can significantly reduce quantization error. Experimental results on three standard datasets with image-text modalities show that SDDMF significantly outperforms several state-of-the-art methods.
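For intuition, one possible form of such an objective is sketched below; it is an illustrative construction consistent with the abstract, not the formulation from the paper. It combines a deep (multi-layer) factorization term for each view, a classification term on the shared binary codes, and a discrete constraint. The symbols X^(v), U_i^(v), B, Y, W and the weights alpha_v, lambda are assumptions introduced only for this sketch.

% Illustrative sketch only; notation is assumed, not taken from the paper.
% X^{(v)}: data matrix of view v
% U_1^{(v)}, ..., U_m^{(v)}: hierarchical (layer-wise) basis factors for view v
% B: unified binary codes shared across views; Y: label matrix; W: linear classifier
\begin{equation*}
\min_{\{U_i^{(v)}\},\, B,\, W}\;
\sum_{v=1}^{V} \alpha_v \,\bigl\| X^{(v)} - U_1^{(v)} U_2^{(v)} \cdots U_m^{(v)} B \bigr\|_F^2
\;+\; \lambda \,\bigl\| Y - W B \bigr\|_F^2
\quad \text{s.t.}\quad B \in \{-1,+1\}^{k \times n}
\end{equation*}

Under this sketch, keeping B constrained to {-1,+1} during optimization (rather than relaxing it to real values and quantizing afterwards) is what would avoid the quantization error mentioned above.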