Abstract

Despite the success of learning with noisy labels, existing approaches show limited performance when the noise level is extremely high, since deep neural networks (DNNs) easily overfit to a training set with corrupted labels. In this paper, we introduce Lipschitz regularization to prevent DNNs from quickly over-fitting to noisy labels. Meanwhile, to better detect and leverage the noisy samples, we propose a Lipschitz-regularization-based framework that combines an adaptive modeling and detection module with improved semi-supervised learning. Instead of relying on parametric models, we adaptively model the real distribution of the training set and the implicit individual clean and noisy distributions. With Bayes' rule, we then compute the posterior probability of a sample being clean, which provides a dynamic threshold for the detection of noisy labels. To reduce the training instability caused by the scarcity of labeled data under severe label noise, we improve the semi-supervised learning stage by combining the advantages of Mixup and FixMatch. This not only increases the diversity of unlabeled samples, but also improves the generalization capability of the DNNs and mitigates over-fitting. Experiments on several benchmarks demonstrate that our approach achieves results comparable to state-of-the-art methods under moderate noise, and obtains a substantial improvement (∼8% and ∼6% in accuracy on CIFAR-10 and CIFAR-100, respectively) under severe noise.
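To make the detection step concrete, the following is a minimal, hypothetical sketch (not the authors' code) of how a Bayes-rule posterior of a sample being clean could be computed from per-sample losses. The paper models the clean and noisy distributions adaptively rather than with parametric mixtures; the histogram-based likelihoods, the function name estimate_clean_posterior, and the parameters n_bins and prior_clean below are illustrative assumptions.

    # Illustrative sketch: Bayes-rule posterior for clean-label detection
    # from per-sample losses, with non-parametric (histogram) likelihoods.
    import numpy as np

    def estimate_clean_posterior(losses, clean_mask_estimate, n_bins=50, prior_clean=0.5):
        """Compute P(clean | loss) for each sample via Bayes' rule.

        losses:              per-sample training losses, shape (N,)
        clean_mask_estimate: boolean split from a previous epoch, shape (N,)
        """
        bins = np.linspace(losses.min(), losses.max(), n_bins + 1)
        # Non-parametric likelihoods p(loss | clean) and p(loss | noisy)
        p_loss_clean, _ = np.histogram(losses[clean_mask_estimate], bins=bins, density=True)
        p_loss_noisy, _ = np.histogram(losses[~clean_mask_estimate], bins=bins, density=True)

        # Look up each sample's bin and read off the two likelihoods
        idx = np.clip(np.digitize(losses, bins) - 1, 0, n_bins - 1)
        lik_clean = p_loss_clean[idx] + 1e-12
        lik_noisy = p_loss_noisy[idx] + 1e-12

        # Bayes' rule: P(clean | loss) is proportional to p(loss | clean) * P(clean)
        posterior = lik_clean * prior_clean / (
            lik_clean * prior_clean + lik_noisy * (1.0 - prior_clean)
        )
        return posterior

Under this reading, samples whose posterior exceeds a dynamic threshold would be treated as clean (labeled), while the remainder would be passed, unlabeled, to the semi-supervised branch that mixes Mixup with FixMatch-style consistency training.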
