Abstract
Few-shot image classification aims to recognize new concepts from limited annotated samples. Our insight is that a sufficiently powerful embedding network (PEN) is key to solving few-shot classification tasks, and we accordingly propose PEN for few-shot image classification. The core of PEN is obtaining a well-trained embedding network capable of extracting strongly discriminative representations of an image, which is achieved through two strategies. The first strategy fuses multi-scale feature maps rather than using only the final top-level feature maps, since low-level features also play an important role alongside top-level representations. The second strategy is knowledge distillation (KD), whose properties help the embedding network extract better features. Finally, a distance function is employed to classify unlabeled samples. Comprehensive experiments are conducted on few-shot benchmarks, and our method achieves promising performance. The results demonstrate that KD and feature fusion are beneficial for obtaining the desired embedding network for few-shot classification tasks.
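The abstract outlines a pipeline of multi-scale feature fusion, knowledge distillation, and distance-based classification. The following is a minimal PyTorch-style sketch of that pipeline under assumed design choices: the network `StudentEmbedding`, the concatenation-based fusion, the soft-target KD loss with temperature `T`, and the prototype-distance classifier are illustrative assumptions, not the paper's exact architecture or training recipe.

```python
# Minimal sketch (assumptions noted above), not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StudentEmbedding(nn.Module):
    """Toy embedding network that fuses low-level and top-level feature maps."""

    def __init__(self, in_channels=3, dim=64):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(in_channels, dim, 3, padding=1),
                                    nn.BatchNorm2d(dim), nn.ReLU(), nn.MaxPool2d(2))
        self.block2 = nn.Sequential(nn.Conv2d(dim, dim, 3, padding=1),
                                    nn.BatchNorm2d(dim), nn.ReLU(), nn.MaxPool2d(2))
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        low = self.block1(x)    # low-level feature map
        top = self.block2(low)  # top-level feature map
        # Multi-scale fusion: concatenate pooled descriptors from both scales.
        return torch.cat([self.pool(low).flatten(1),
                          self.pool(top).flatten(1)], dim=1)


def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target KD loss; the temperature T=4.0 is an assumed value."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T


def prototype_classify(query_emb, support_emb, support_labels, n_way):
    """Label queries by Euclidean distance to per-class support prototypes."""
    prototypes = torch.stack([support_emb[support_labels == c].mean(0)
                              for c in range(n_way)])
    dists = torch.cdist(query_emb, prototypes)  # (n_query, n_way)
    return dists.argmin(dim=1)                  # nearest prototype wins
```

In this sketch, an episode would embed support and query images with the student network, combine a standard classification loss with `distillation_loss` against a frozen teacher during training, and call `prototype_classify` at evaluation time.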