Abstract

Deep neural networks (DNNs) have achieved impressive success on a variety of classification tasks. However, noisy labels in the training dataset adversely affect the performance of DNNs. Numerous noise-robust loss functions have recently been proposed to combat the noisy-label problem, but we find that these losses are either slow to learn the underlying data patterns or insufficiently robust to noisy labels. Here, we propose an improved categorical cross entropy (ICCE) to address this challenge. ICCE automatically adjusts its weighting scheme according to the DNN's predicted probability distribution via an exponential term, which gives it both strong noise robustness and fast learning ability. We present a theoretical analysis of ICCE in the presence of noisy labels. Experiments on multiple datasets indicate that ICCE improves the performance of DNNs even under high levels of label noise.
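
The abstract does not give the exact form of ICCE, so the following is only an illustrative sketch of the general idea it describes: a categorical cross entropy whose per-sample weight is modulated by an exponential term of the predicted probability. The weighting form exp(-beta * (1 - p_y)) and the hyperparameter beta are assumptions for illustration, not the paper's definition.

```python
import torch
import torch.nn.functional as F

def icce_loss_sketch(logits, targets, beta=2.0):
    """Illustrative exponentially weighted cross entropy (NOT the paper's ICCE).

    Each sample's CE term is scaled by exp(-beta * (1 - p_y)), where p_y is
    the predicted probability of the labeled class. Confidently fitted
    samples keep near-full weight, while low-confidence samples (which are
    more likely to be mislabeled) are exponentially down-weighted.
    `beta` and this exact weighting form are hypothetical choices.
    """
    log_probs = F.log_softmax(logits, dim=1)                      # (N, C)
    log_p_y = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # (N,)
    ce = -log_p_y                                                 # per-sample CE
    # Detach the weights so they rescale gradients without being optimized.
    weights = torch.exp(-beta * (1.0 - log_p_y.exp())).detach()
    return (weights * ce).mean()
```

In a sketch like this, beta controls the trade-off the abstract alludes to: small beta recovers plain cross entropy (fast learning, weak robustness), while large beta suppresses gradients from low-confidence, potentially noisy samples (strong robustness, slower fitting of hard clean examples).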
