Abstract

Few-shot image classification aims to recognize new concepts from a limited number of annotated samples. Our insight is that a sufficiently powerful embedding network (PEN) can solve few-shot classification tasks, and we accordingly propose PEN for few-shot image classification. The core of PEN is a well-trained embedding network capable of extracting strongly discriminative representations of an image, obtained through two strategies. The first strategy is fusing multi-scale feature maps rather than using only the final top-level feature maps, since low-level features also play an important role in representing an image. The second strategy is knowledge distillation (KD), whose properties help us train an embedding network that extracts better features. Finally, a distance function is employed to classify unlabeled samples. Comprehensive experiments on few-shot benchmarks show that our method achieves promising performance, demonstrating that KD and feature fusion are beneficial for obtaining the desired embedding network for few-shot classification tasks.
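To make the two strategies concrete, the following is a minimal NumPy sketch, not the authors' actual implementation: multi-scale fusion is illustrated as pooling and concatenating feature maps from several network stages, KD as a soft-target cross-entropy between temperature-softened teacher and student distributions, and classification as nearest-prototype matching under Euclidean distance. All function names and the choice of global average pooling are illustrative assumptions.

```python
import numpy as np

def global_avg_pool(fmap):
    # fmap: (C, H, W) feature map -> (C,) vector.
    return fmap.mean(axis=(1, 2))

def fuse_multiscale(fmaps):
    # Pool and concatenate low-, mid-, and top-level feature maps
    # instead of using only the final top-level one.
    return np.concatenate([global_avg_pool(f) for f in fmaps])

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Soft-target distillation term: cross-entropy between the
    # teacher's and student's temperature-softened distributions.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(-(p_t * np.log(p_s + 1e-12)).sum())

def prototype_classify(query_emb, support_embs, support_labels):
    # Build one prototype (mean embedding) per class and assign the
    # query to the nearest prototype by Euclidean distance.
    classes = sorted(set(support_labels))
    protos = np.stack([
        np.mean([e for e, y in zip(support_embs, support_labels) if y == c],
                axis=0)
        for c in classes
    ])
    dists = np.linalg.norm(protos - query_emb, axis=1)
    return classes[int(np.argmin(dists))]
```

For example, fusing three stages with 4, 8, and 16 channels yields a 28-dimensional embedding, and a query embedding is labeled by whichever class mean it lies closest to.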
