Abstract

Although standard meta-learning methods have demonstrated strong performance in few-shot image classification, they typically lack the ability to assess the reliability of their predictions, which poses risks in certain applications. To address this problem, we first propose a meta-learning-based Evidential Deep Learning (EDL) method, Meta Evidence Deep Learning (MetaEDL), which enables reliable prediction in the few-shot image classification scenario. Like general meta-learning methods, MetaEDL employs a shallow neural network as its feature extractor to avoid overfitting on few-shot samples, which significantly restricts its ability to extract features. To overcome this limitation, we further propose Meta Transfer Evidence Deep Learning (MetaTEDL) for trustworthy few-shot classification. MetaTEDL adopts a large-scale pre-trained neural network as its feature extractor; during meta-training, only two lightweight neuron operations, Scaling and Shifting, are trained, which reduces the risk of overfitting. Two evidential head networks are then trained to integrate evidence from different sources, improving the quality of the output evidence. We conduct comprehensive experiments on several challenging few-shot classification benchmarks. The results show that our proposed method not only outperforms other conventional meta-learning methods in few-shot classification performance, but also exhibits strong uncertainty quantification (UQ), uncertainty-guided active learning, and out-of-distribution (OOD) detection capabilities.
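To make the described mechanism concrete, the following is a minimal PyTorch sketch of the two ingredients mentioned in the abstract: lightweight per-channel Scaling and Shifting applied to features from a frozen pre-trained backbone, and two evidential heads whose non-negative evidence is fused into Dirichlet parameters for prediction and uncertainty estimation. The module names (`ScaleShift`, `EvidentialHead`), the simple additive fusion rule, and all sizes are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch of MetaTEDL-style components; names and fusion rule are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaleShift(nn.Module):
    """Per-channel Scaling and Shifting on frozen pre-trained feature maps;
    only gamma/beta would be meta-trained."""
    def __init__(self, num_channels):
        super().__init__()
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):                  # x: (B, C, H, W)
        return x * self.gamma + self.beta

class EvidentialHead(nn.Module):
    """Maps features to non-negative class evidence; Dirichlet parameters
    are alpha = evidence + 1, as in standard Evidential Deep Learning."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.fc = nn.Linear(in_dim, num_classes)

    def forward(self, feat):               # feat: (B, D)
        return F.softplus(self.fc(feat))   # evidence >= 0

def fuse_and_predict(evidence_a, evidence_b):
    """Combine evidence from two heads (here: a simple sum, an assumption),
    then derive expected class probabilities and predictive uncertainty."""
    evidence = evidence_a + evidence_b
    alpha = evidence + 1.0                           # Dirichlet parameters
    strength = alpha.sum(dim=-1, keepdim=True)       # Dirichlet strength S
    probs = alpha / strength                         # expected probabilities
    num_classes = alpha.shape[-1]
    uncertainty = num_classes / strength.squeeze(-1) # vacuity u = K / S
    return probs, uncertainty

if __name__ == "__main__":
    B, C, H, W, K = 4, 64, 8, 8, 5                   # toy 5-way episode
    frozen_feat = torch.randn(B, C, H, W)            # from a frozen pre-trained backbone
    ss = ScaleShift(C)
    feat = ss(frozen_feat).mean(dim=(2, 3))          # global average pooling -> (B, C)
    head_a, head_b = EvidentialHead(C, K), EvidentialHead(C, K)
    probs, u = fuse_and_predict(head_a(feat), head_b(feat))
    print(probs.shape, u)                            # (4, 5) probabilities, per-sample uncertainty
```

In this sketch, a high vacuity value u signals that the fused evidence is weak, which is the kind of uncertainty estimate that can drive the OOD detection and uncertainty-guided active learning evaluated in the paper.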
