Abstract

Highlights

• Classical cross-entropy loss may cause serious overfitting.
• Regularization is beneficial for avoiding overfitting.
• Label distributions are more robust than hard labels.

It is challenging to train deep neural networks robustly with noisy labels, since deep neural networks can completely overfit these noisy labels. In this paper, motivated by label distribution learning, we propose a novel method named Feature-Induced Label Distribution (FILD) to handle noisy labels. Specifically, FILD recovers label distributions by leveraging the topological structure of the feature space, and the feature representation is adjusted alternately by fitting the predictive model to the recovered label distributions. Extensive experiments on CIFAR-10, CIFAR-100, and Clothing1M clearly validate the effectiveness of FILD against competing approaches.
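The abstract only sketches the recovery step. For intuition, below is a minimal Python sketch of one plausible instantiation, assuming the feature-space topology is captured by a k-nearest-neighbor graph and that noisy one-hot labels are smoothed over neighbors; the function name, k, and the mixing weight alpha are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def recover_label_distributions(features, noisy_onehot, k=10, alpha=0.5):
    """Hypothetical sketch: smooth each sample's (possibly noisy) one-hot
    label with the labels of its k nearest neighbors in feature space,
    yielding a soft label distribution per sample."""
    # Pairwise squared Euclidean distances between all samples.
    sq = (features ** 2).sum(axis=1)
    d = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    np.fill_diagonal(d, np.inf)                 # exclude each sample itself
    nn = np.argsort(d, axis=1)[:, :k]           # indices of k nearest neighbors
    neighbor_avg = noisy_onehot[nn].mean(axis=1)  # average neighbor labels
    dist = alpha * noisy_onehot + (1.0 - alpha) * neighbor_avg
    return dist / dist.sum(axis=1, keepdims=True)  # normalize to distributions

# Alternating scheme, as described in the abstract (pseudo-training loop):
# for round in range(num_rounds):
#     soft = recover_label_distributions(extract_features(model, X), Y_onehot)
#     # train model on (X, soft) with a soft-label loss (e.g. KL divergence),
#     # which in turn updates the feature representation for the next round.
```

The key design choice this sketch illustrates is that supervision comes from recovered label distributions rather than the raw hard labels, so a single mislabeled sample is diluted by its feature-space neighbors.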
