Abstract

Generalized zero-shot learning (GZSL) aims to recognize both seen and unseen classes, although only samples from seen classes are available for training. Mainstream methods mitigate the lack of unseen training data by synthesizing visual samples for unseen classes. However, the sample generator is trained only on seen-class samples, and the semantic descriptions of unseen classes are merely fed to this pre-trained generator to produce unseen data. The generator is therefore biased toward seen categories, and the quality of the generated unseen samples, in terms of both precision and diversity, remains the main learning challenge. To this end, we propose Prototype-Guided Generation for Generalized Zero-Shot Learning (PGZSL), which guides sample generation with unseen-class knowledge. First, unseen data generation in PGZSL is guided and rectified by contrastive prototypical anchors that enforce both class semantic consistency and feature discriminability. Second, PGZSL introduces Certainty-Driven Mixup for the generator, which enriches the diversity of generated unseen samples while suppressing the generation of uncertain boundary samples. Empirical results on five benchmark datasets show that PGZSL significantly outperforms state-of-the-art methods on both ZSL and GZSL tasks.
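For readers unfamiliar with the mixup operation that Certainty-Driven Mixup builds on, the following minimal sketch illustrates standard feature-level mixup, i.e., convex interpolation of two samples and their labels. It is an illustration only: the certainty-driven weighting and suppression of uncertain boundary samples described in the abstract are not reproduced here, and all names and shapes are assumptions.

```python
import numpy as np

def mixup(x_a, x_b, y_a, y_b, alpha=0.2, rng=None):
    """Standard mixup: convexly combine two feature vectors and their labels.

    The mixing coefficient lam is drawn from Beta(alpha, alpha).
    Certainty-Driven Mixup (as described in the abstract) would additionally
    modulate or filter pairs by a certainty score, which is not shown here.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x_mix = lam * x_a + (1.0 - lam) * x_b
    y_mix = lam * y_a + (1.0 - lam) * y_b
    return x_mix, y_mix

# Toy usage: mix two (hypothetical) 2048-d generated features with one-hot labels
x1, x2 = np.random.randn(2048), np.random.randn(2048)
y1, y2 = np.eye(5)[0], np.eye(5)[3]
x_mix, y_mix = mixup(x1, x2, y1, y2)
```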
