Abstract

Robust few-shot learning (RFSL), which aims to address noisy labels in few-shot learning, has recently gained considerable attention. Existing RFSL methods assume that the noise comes from known classes (in-domain), an assumption that is inconsistent with many real-world scenarios where the noise may not belong to any known class (out-of-domain). We refer to this more complex scenario as open-world few-shot learning (OFSL), in which in-domain and out-of-domain noise coexist in few-shot datasets. To address this challenging problem, we propose a unified framework that performs comprehensive calibration from instance to metric. Specifically, we design a dual-network structure composed of a contrastive network and a meta network that extract intra-class feature information and enlarged inter-class variations, respectively. For instance-wise calibration, we present a novel prototype modification strategy that aggregates prototypes with intra-class and inter-class instance reweighting. For metric-wise calibration, we present a novel metric that implicitly scales the per-class prediction by fusing two spatial metrics constructed by the two networks. In this way, the impact of noise in OFSL can be effectively mitigated in both the feature space and the label space. Extensive experiments on various OFSL settings demonstrate the robustness and superiority of our method. Our source code is available at https://github.com/anyuexuan/IDEAL.
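To make the two calibration steps above concrete, the sketch below illustrates one plausible reading of them in PyTorch: per-instance reweighting when aggregating class prototypes (instance-wise calibration), and fusion of similarities computed in the two networks' feature spaces (metric-wise calibration). The function names, the cosine-similarity base metric, and the fusion weight `alpha` are assumptions introduced here for illustration; this is a minimal sketch, not the released IDEAL implementation.

```python
# Illustrative sketch only (not the authors' code). Assumes features have
# already been extracted by two encoders: a contrastive network and a meta
# network, as described in the abstract.

import torch
import torch.nn.functional as F


def weighted_prototypes(support_feats, support_labels, weights, n_classes):
    """Instance-wise calibration (assumed form): aggregate each class prototype
    as a weighted mean of its support features, so suspected noisy instances
    can be down-weighted."""
    protos = []
    for c in range(n_classes):
        mask = support_labels == c
        w = weights[mask]
        w = w / (w.sum() + 1e-8)                      # normalize weights within the class
        protos.append((w.unsqueeze(1) * support_feats[mask]).sum(dim=0))
    return torch.stack(protos)                        # [n_classes, dim]


def fused_scores(query_c, query_m, protos_c, protos_m, alpha=0.5):
    """Metric-wise calibration (assumed form): fuse per-class similarities
    computed in the contrastive and meta feature spaces into one score."""
    sim_c = F.cosine_similarity(query_c.unsqueeze(1), protos_c.unsqueeze(0), dim=-1)
    sim_m = F.cosine_similarity(query_m.unsqueeze(1), protos_m.unsqueeze(0), dim=-1)
    return alpha * sim_c + (1 - alpha) * sim_m        # [n_query, n_classes]
```

Under this reading, prototypes become less sensitive to mislabeled or out-of-domain support instances through the reweighting, while the fused score keeps the final prediction from depending on distortions in any single feature space.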
