Abstract

The batch-based hash learning paradigm for uni-modal and cross-modal retrieval has made great progress in recent decades. However, methods based on this paradigm cannot adapt to continuously arriving data streams; moreover, they are inefficient in training time and memory cost for large-scale search because they must accumulate all database data and newly arriving data before training. Although a few online hash retrieval methods have been proposed in recent years to address these issues, they are based on shallow models, and none of them can perform both uni-modal and cross-modal retrieval in one framework. To this end, we propose a novel method, namely, Online Deep Hashing for both Uni-modal and Cross-modal retrieval (ODHUC). For online deep hashing, ODHUC first trains image and text neural networks on fundamental image and text databases and learns their hash codes. When new data arrive, ODHUC samples images and texts from the newly arriving data and the fundamental databases to update the image and text networks and to learn the hash codes of the new data, with the aim of keeping the similarity between the hash codes of the incremental and old data consistent with the supervised semantic similarity. ODHUC is also trained on sampled old data via knowledge distillation, so that it learns new knowledge while avoiding catastrophic forgetting of old knowledge. In this way, ODHUC achieves continual online deep hash learning without performance decline. To support both categories of retrieval tasks, ODHUC learns the image and text deep hash functions in two stages to align image and text features: it generates image hash codes first and then uses them to supervise the text hash code learning. Thus, ODHUC can perform both uni-modal and cross-modal retrieval in one framework. Extensive experiments on three real-world datasets demonstrate that ODHUC outperforms several state-of-the-art methods.
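The two objectives the abstract mentions — keeping hash-code similarity consistent with supervised semantic similarity, and distilling codes of sampled old data to avoid forgetting — can be sketched as follows. This is a minimal illustration under common hashing conventions (codes in {-1, +1}, a squared-error similarity loss); the function names and exact loss forms are assumptions for exposition, not the paper's precise formulation.

```python
import numpy as np

def similarity_loss(B, S):
    """Similarity-preservation objective (illustrative).

    B: (n, k) hash codes with entries in {-1, +1}.
    S: (n, n) supervised semantic similarity with entries in {-1, +1}.
    Penalizes disagreement between normalized code inner products
    (B @ B.T / k, which lies in [-1, 1]) and the semantic similarity.
    """
    k = B.shape[1]
    return np.sum((B @ B.T / k - S) ** 2)

def distillation_loss(B_new, B_old):
    """Knowledge-distillation objective (illustrative).

    Keeps the updated network's codes for sampled old data (B_new)
    close to the codes produced before the update (B_old), which
    discourages catastrophic forgetting of old knowledge.
    """
    return np.sum((B_new - B_old) ** 2)
```

In an online round, the new data contribute to `similarity_loss` (against the supervised similarity of new and old samples), while the sampled old data additionally contribute to `distillation_loss` against their previously learned codes.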
