Abstract

Training noise-robust deep neural networks (DNNs) under label noise is a crucial task. In this paper, we first demonstrate that DNNs trained with label noise overfit the noisy labels because they are overconfident in their own learning capacity. More significantly, they also potentially under-learn the samples with clean labels. DNNs should therefore pay more attention to clean samples than to noisy ones. Inspired by sample-weighting strategies, we propose a meta-probability weighting (MPW) algorithm that weights the output probabilities of DNNs to prevent overfitting to label noise and to alleviate under-learning on clean samples. MPW performs an approximate optimization to adaptively learn the probability weights from data under the supervision of a small clean dataset, alternating between updating the probability weights and the network parameters via a meta-learning paradigm. Ablation studies substantiate that MPW prevents DNNs from overfitting to label noise and improves learning on clean samples. Furthermore, MPW achieves performance competitive with state-of-the-art methods on both synthetic and real-world noise.
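The abstract does not spell out the form of the probability weighting or the meta-update, so the following PyTorch fragment is only a rough, non-authoritative sketch of the bi-level idea it describes. It assumes (our assumption, not the paper's) per-class weights that rescale the softmax probabilities before the loss, and a Meta-Weight-Net-style alternation: a virtual parameter step on a noisy batch, a weight update on a small clean batch through that virtual step, then the real parameter update. All identifiers (w, weighted_loss, etc.) are hypothetical; requires PyTorch >= 2.0 for torch.func.

    import torch
    import torch.nn.functional as F
    from torch.func import functional_call

    torch.manual_seed(0)

    # Toy classifier and learnable per-class probability weights (hypothetical form).
    model = torch.nn.Linear(10, 3)           # 10 features, 3 classes
    w = torch.ones(3, requires_grad=True)    # probability weights

    opt_theta = torch.optim.SGD(model.parameters(), lr=0.1)
    opt_w = torch.optim.SGD([w], lr=0.1)

    def weighted_loss(params, x, y, weights):
        # Rescale softmax probabilities by positive weights, renormalise, take NLL.
        logits = functional_call(model, params, (x,))
        probs = F.softmax(logits, dim=1) * F.softplus(weights)
        probs = probs / probs.sum(dim=1, keepdim=True)
        return F.nll_loss(torch.log(probs + 1e-12), y)

    def plain_loss(params, x, y):
        # Standard cross-entropy on the small clean meta batch.
        return F.cross_entropy(functional_call(model, params, (x,)), y)

    # One noisy training batch and one small clean meta batch (random toy data).
    x_tr, y_tr = torch.randn(32, 10), torch.randint(0, 3, (32,))
    x_cl, y_cl = torch.randn(8, 10), torch.randint(0, 3, (8,))

    # 1) Virtual SGD step on the network parameters with the current weights.
    params = dict(model.named_parameters())
    grads = torch.autograd.grad(weighted_loss(params, x_tr, y_tr, w),
                                list(params.values()), create_graph=True)
    virtual = {k: p - 0.1 * g for (k, p), g in zip(params.items(), grads)}

    # 2) Update the probability weights on the clean batch through the virtual step.
    opt_w.zero_grad()
    plain_loss(virtual, x_cl, y_cl).backward()
    opt_w.step()

    # 3) Real update of the network parameters with the refreshed weights.
    opt_theta.zero_grad()
    weighted_loss(dict(model.named_parameters()), x_tr, y_tr, w.detach()).backward()
    opt_theta.step()

Keeping create_graph=True in the virtual step is what lets the clean-batch loss backpropagate into the probability weights; the real update then uses the refreshed, detached weights so the network step stays a standard first-order update.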
