Abstract

Few-shot classification, which aims to learn an accurate classifier for novel classes from only a few annotated support samples, is a challenging task. Existing few-shot classification algorithms first extract prototypes from the support images and then use these prototypes to classify the query images with various elaborately designed matching methods. However, these algorithms ignore the importance of prior knowledge, which seriously hampers generalized prototype learning and weakens classification performance. Meanwhile, the query features obtained from the extractor act as confounders, causing disruption between different classes and dramatically limiting classification performance. To tackle these challenges, we propose a novel prior-knowledge-guided few-shot classification network with class disentanglement. Specifically, a prior knowledge guided module is constructed to enrich the original prototypes with more generalized knowledge in a selective combination manner. Meanwhile, a class disentanglement module is designed to disentangle the confounder (the extracted query features), eliminating the disruption between different classes. By combining prior knowledge and disentangling the confounder, our network efficiently addresses few-shot classification. Comprehensive experiments on miniImageNet and tieredImageNet demonstrate the superiority of the proposed network.
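The prototype-based pipeline described in the abstract (class prototypes computed from support embeddings, query images matched against them) can be illustrated with a minimal sketch. This is not the paper's method: the mean-pooled prototypes, the Euclidean matching metric, and all function names below are illustrative assumptions standing in for the paper's prior-knowledge-guided and disentangled variants.

```python
import numpy as np

def compute_prototypes(support_feats, support_labels, n_classes):
    """Illustrative prototype extraction: one prototype per class,
    taken as the mean of that class's support embeddings."""
    return np.stack([
        support_feats[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])

def classify_queries(query_feats, prototypes):
    """Illustrative matching step: assign each query to the class
    whose prototype is nearest in Euclidean distance."""
    dists = ((query_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)
```

In a real few-shot setup the features would come from a learned backbone; the paper's contribution is to refine the prototypes with prior knowledge and to disentangle the query features before this matching step.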
