Abstract

In recent years, metric-based meta-learning methods have received widespread attention because of their effectiveness in solving few-shot classification problems. However, the scarcity of data frequently results in suboptimal embeddings, causing a discrepancy between the anticipated class prototypes and those derived from the support set. These problems severely limit the generalizability of such methods, motivating further development in few-shot learning (FSL). In this study, we propose the Contrastive Prototype Network (CPN), consisting of three components: (1) a contrastive learning branch, used as an auxiliary path to reduce the distance between homogeneous samples and amplify the differences between heterogeneous samples, thereby improving the quality and effectiveness of the embeddings; (2) a pseudo-prototype strategy that addresses prototype bias by integrating pseudo-prototypes generated from query-set samples with the initial prototypes, yielding more representative prototypes; and (3) a new data augmentation technique, mixupPatch, which alleviates the shortage of data samples by blending images and labels from different samples to generate additional training examples. Extensive experiments and ablation studies on five datasets demonstrate that CPN achieves robust results compared with recent methods.
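The abstract does not specify the exact blending rule used by mixupPatch. As a rough illustration only, the general mixup-style idea it builds on — convexly combining two images and their one-hot labels with a Beta-sampled coefficient — can be sketched as follows; the function name, signature, and default `alpha` here are assumptions, not the paper's definition:

```python
import numpy as np

def mixup_blend(x1, y1, x2, y2, alpha=1.0, rng=None):
    """Mixup-style blending of two samples (illustrative sketch only).

    x1, x2: image arrays of identical shape, values in [0, 1].
    y1, y2: one-hot label vectors.
    alpha:  Beta-distribution parameter controlling the mixing strength.
    """
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)          # mixing coefficient in (0, 1)
    x = lam * x1 + (1.0 - lam) * x2       # blended image
    y = lam * y1 + (1.0 - lam) * y2       # correspondingly blended soft label
    return x, y
```

A patch-based variant (as the name mixupPatch suggests) would typically apply such blending at the level of image regions rather than whole images, but the paper's abstract leaves those details to the full text.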
