Abstract

With the explosive growth of visual media categories, zero-shot learning (ZSL) aims to transfer knowledge obtained from seen classes to unseen classes in order to recognize novel instances. However, a domain gap exists between the seen and unseen classes, and simply matching unseen instances via nearest-neighbor search in the embedding space cannot bridge this gap effectively. In this paper, we propose a Holistically-Associated Model to overcome this obstacle. In particular, the proposed model is designed to address two fundamental problems of ZSL: representation learning and label assignment for the unseen classes. We address the first problem with an affinity propagation network, which exploits the holistic pairwise connections among all classes to produce exemplar features for the unseen classes. We handle the second problem with a progressive clustering module, which iteratively refines the unseen clusters so that the features of all unseen instances can be used jointly for reliable class-wise label assignment. Thanks to the precise exemplar features and class-wise label assignment, our model effectively bridges the domain gap. We extensively evaluate the proposed model on five human action and image datasets, i.e., Olympics Sports, HMDB51, UCF101, AWA2, and SUN. Experimental results show that the proposed model outperforms state-of-the-art methods on these substantially different datasets.
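
The contrast between per-instance nearest-neighbor matching and class-wise assignment can be illustrated with a minimal sketch. The sketch below assumes instance features and class exemplars already share an embedding space; the function names and the plain k-means-style cluster refinement are illustrative stand-ins, not the paper's affinity propagation network or progressive clustering module.

# Minimal sketch (assumed, not the paper's implementation): per-instance
# nearest-neighbor labeling versus a simple cluster-then-label strategy.
import numpy as np

def nearest_neighbor_assign(instances, exemplars):
    """Per-instance assignment: each instance takes the label of its nearest
    class exemplar (the baseline the abstract argues is insufficient)."""
    # Cosine similarity between every instance and every class exemplar.
    inst = instances / np.linalg.norm(instances, axis=1, keepdims=True)
    exem = exemplars / np.linalg.norm(exemplars, axis=1, keepdims=True)
    return np.argmax(inst @ exem.T, axis=1)

def clustered_assign(instances, exemplars, n_iters=10):
    """Class-wise assignment: iteratively refine one cluster per unseen class,
    seeded by the exemplars, then label each cluster as a whole."""
    centers = exemplars.copy()
    for _ in range(n_iters):
        labels = nearest_neighbor_assign(instances, centers)
        for c in range(len(centers)):
            members = instances[labels == c]
            if len(members) > 0:          # keep the old center if a cluster empties
                centers[c] = members.mean(axis=0)
    return nearest_neighbor_assign(instances, centers)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    exemplars = rng.normal(size=(5, 64))                  # 5 hypothetical unseen classes
    instances = exemplars[rng.integers(0, 5, 200)] + \
        rng.normal(scale=0.5, size=(200, 64))             # noisy unseen instances
    print(clustered_assign(instances, exemplars)[:10])

Here the cluster refinement is an ordinary mean update; in the paper this role is played by the progressive clustering module operating on exemplar features produced by the affinity propagation network.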
