Abstract

Sketch-based 3D shape retrieval has been an active but challenging task for several decades. In this paper, we analyze the challenges in depth and propose a novel Hierarchical Domain-Augmented Adaptive Learning (HDA2L) framework for sketch-based 3D shape retrieval. The first notable challenge is the vast cross-modality discrepancy between sketches and 3D shapes. To address this issue, existing methods constrain the consistency of the final features with a shared cross-domain loss but ignore the feature extraction process, which limits their effect. We argue that the mutual information between samples from the same class but different domains provides an important cue for strengthening the common features captured during feature extraction. We therefore design an Inter-Domain Augmented Network (Inter-DAN) that employs inter-domain feature correlation learning to capture cross-domain mutual information and learn augmented common global features for both sketches and 3D shapes. Another challenge is that input sketches vary widely: a sketch may be highly abstract, containing only the overall outline of the target model, or it may be so rough that it depicts only a few salient local regions. Although existing methods can capture overall features, they typically neglect the learning of local discriminative features and fail to adapt to such variation in the input sketches. To address this issue, we design an Intra-Domain Augmented Network (Intra-DAN) for sketches and 3D shapes, respectively, which learns augmented local discriminative features through cascading cross-layer bilinear pooling operations. In addition, we design a Source-Agnostic Adversarial Network (SAAN) to perform adaptive fusion of the hierarchical domain features, forcing the network to adaptively focus on the more discriminative information among global and local features and thereby further adapt to the diversity of input sketches.
Experiments on three benchmark datasets demonstrate that our method achieves superior retrieval performance compared with state-of-the-art sketch-based 3D shape retrieval approaches.
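To make the cross-layer bilinear pooling step concrete, the following is a minimal NumPy sketch of how two feature maps from different layers can be pooled into a joint descriptor and how pooled pairs can be cascaded by concatenation. The feature dimensions, the pairing of layers, and the signed-sqrt/L2 normalization are common conventions for bilinear pooling and are illustrative assumptions here, not the paper's exact architecture.

```python
import numpy as np

def cross_layer_bilinear_pool(feat_a, feat_b):
    """Bilinear pooling of two feature maps taken from different layers.

    feat_a, feat_b: (C, N) arrays -- C channels over N spatial locations
    (spatial sizes assumed already matched, e.g. by resizing).
    Returns a (C*C,) descriptor with signed-sqrt and L2 normalization.
    """
    n = feat_a.shape[1]
    # Outer-product pooling averaged over spatial locations -> (C, C)
    b = feat_a @ feat_b.T / n
    v = b.flatten()
    # Signed square-root normalization, then L2 normalization
    v = np.sign(v) * np.sqrt(np.abs(v))
    return v / (np.linalg.norm(v) + 1e-12)

# Cascading across three hypothetical layers: pool adjacent layer pairs,
# then concatenate the pooled descriptors into one local feature vector.
rng = np.random.default_rng(0)
f1, f2, f3 = (rng.standard_normal((8, 49)) for _ in range(3))
descriptor = np.concatenate([cross_layer_bilinear_pool(f1, f2),
                             cross_layer_bilinear_pool(f2, f3)])
print(descriptor.shape)  # (128,)
```

Pooling features from *different* layers, rather than squaring a single layer's features, lets the descriptor capture interactions between coarse and fine activations, which is the intuition behind using it for local discriminative features.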
