Abstract

Recent research on Few-Shot Learning (FSL) has made extensive progress. However, existing efforts focus primarily on the transductive setting of FSL, which is heavily constrained by the limited quantity of the unlabeled query set. Although a few inductive FSL methods have been studied, most of them emphasize learning a strong feature extraction network and may therefore ignore the relations between sample-level and class-level representations, which are particularly crucial when labeled samples are scarce. This paper proposes an inductive FSL framework based on Hierarchical Knowledge Propagation and Distillation, named HKPD. To learn more discriminative sample-level representations, HKPD first constructs a sample-level information propagation module that explores pairwise sample relations. A class-level information propagation module is then designed to obtain and update the class-level representations. Moreover, a self-distillation module further improves the learned representations by propagating the acquired knowledge across this hierarchical architecture. Extensive experiments on commonly used few-shot benchmark datasets demonstrate the superiority of the proposed HKPD, which outperforms current state-of-the-art methods.
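The abstract does not specify the architecture of the propagation or distillation modules, so the following is only a minimal NumPy sketch of the hierarchical idea it describes. All concrete choices here are assumptions: cosine-similarity affinities for pairwise sample relations, mean embeddings as class-level representations, nearest-prototype classification, and a KL-divergence term standing in for the self-distillation signal.

```python
import numpy as np

def cosine_affinity(X):
    """Pairwise cosine similarity between row embeddings."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def propagate(X, alpha=0.5):
    """One step of affinity-weighted information propagation:
    blend each embedding with the similarity-weighted mean of the others."""
    A = cosine_affinity(X)
    np.fill_diagonal(A, 0.0)
    A = np.clip(A, 0.0, None)                      # keep only positive relations
    A = A / (A.sum(axis=1, keepdims=True) + 1e-9)  # row-normalise
    return alpha * X + (1.0 - alpha) * (A @ X)

def class_prototypes(X, y, n_classes):
    """Class-level representations as per-class means of support embeddings."""
    return np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def logits_to_protos(Q, P):
    """Negative Euclidean distance from each query to each prototype."""
    return -np.linalg.norm(Q[:, None, :] - P[None, :, :], axis=2)

# --- toy 2-way 2-shot episode with random embeddings (illustration only) ---
rng = np.random.default_rng(0)
support = rng.normal(size=(4, 8))   # 4 labelled support embeddings
labels = np.array([0, 0, 1, 1])
query = rng.normal(size=(3, 8))     # 3 unlabelled query embeddings

# "student": predictions from raw embeddings and raw prototypes
student_probs = softmax(
    logits_to_protos(query, class_prototypes(support, labels, 2)))

# "teacher": hierarchical propagation -- sample level first, then class level
prop = propagate(np.vstack([support, query]))                      # sample-level
teacher_protos = propagate(class_prototypes(prop[:4], labels, 2))  # class-level
teacher_probs = softmax(logits_to_protos(prop[4:], teacher_protos))

# self-distillation signal: pull student predictions toward the teacher's
def kl(p, q, eps=1e-9):
    return float(np.sum(p * np.log((p + eps) / (q + eps)), axis=1).mean())

distill_loss = kl(teacher_probs, student_probs)
print(teacher_probs.shape)
```

In this sketch the "hierarchy" is the ordering of the two propagation steps (samples first, then the class prototypes built from them), and the distillation term transfers the propagated predictions back to the plain prototype classifier; the paper's actual modules are learned networks rather than these fixed operations.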
