Abstract

Deep hashing has great potential for large-scale visual similarity search owing to its efficiency in storage and computation. Technically, deep hashing for visual similarity search inherits the powerful representation capability of deep neural networks and encodes visual features into compact binary codes while preserving representative semantic information. Work in this field mainly focuses on building the relationship between the visual space and the target hash space, but seldom studies the triadic cross-domain semantic knowledge transfer among the visual, semantic, and hashing spaces, which leads to a serious semantic ignorance problem during space transformation. In this article, we propose a novel deep tripartite semantically interactive hashing framework, dubbed the Semantically Cycle-consistent Hashing Network (SCHN), for discriminative hash code learning. In particular, we construct a flexible semantic space and a transitive latent space which, in conjunction with the visual space, jointly derive a privileged discriminative hash space. Specifically, the new semantic space is conceived to strengthen the flexibility and completeness of categories during semantic feature inference, while the transitive latent space is formulated to explore and uncover the shared semantic interactivity embedded in visual and semantic features. Moreover, to further ensure semantic consistency across the multiple spaces, we build a cyclic adversarial learning module that preserves their semantic concurrence during space transformation. Notably, SCHN, for the first time, establishes a cyclic principle for deep semantic-preserving hashing through adaptive semantic parsing across different spaces in single-modal visual similarity search. The entire framework is optimized jointly in an end-to-end manner. Extensive experiments on diverse large-scale datasets demonstrate the superiority of our method over other state-of-the-art deep hashing algorithms. The source code is available at https://github.com/JalinWang/SCHN.
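To make the deep hashing setting concrete, the following is a minimal, generic sketch of a hash head of the kind the abstract describes: backbone features are projected to a fixed code length, relaxed with tanh during training, and binarized with sign for retrieval by Hamming distance. This is an illustrative assumption, not the SCHN architecture; all module names, dimensions, and the feature-to-code pipeline here are hypothetical.

```python
# Generic deep hashing head sketch (illustrative only; not the authors' SCHN).
import torch
import torch.nn as nn

class HashHead(nn.Module):
    def __init__(self, feat_dim: int = 2048, code_len: int = 64):
        super().__init__()
        self.fc = nn.Linear(feat_dim, code_len)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Relaxed (continuous) codes in [-1, 1], used while training.
        return torch.tanh(self.fc(features))

    @torch.no_grad()
    def binarize(self, features: torch.Tensor) -> torch.Tensor:
        # Compact binary codes in {-1, +1}, used for storage and Hamming search.
        return torch.sign(self.forward(features))

# Usage: hash a query and a small database, then rank by Hamming distance.
head = HashHead(feat_dim=2048, code_len=64)
query = head.binarize(torch.randn(1, 2048))       # (1, 64)
database = head.binarize(torch.randn(100, 2048))  # (100, 64)
hamming = (query.shape[1] - query @ database.t()) / 2  # smaller = more similar
ranking = hamming.argsort(dim=1)
```

For ±1 codes the inner product equals the code length minus twice the Hamming distance, which is why the ranking above only needs a matrix product; this cheap comparison is the storage and computation advantage the abstract refers to.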
