Abstract

Despite their excellent performance in image classification, deep neural networks (DNNs) require large sets of clean data with accurate annotations for training. Collecting a dataset is easy, but annotating the collected data accurately is difficult. Abundant image data are available on the web, but their annotations are often inaccurate, and training on such datasets makes networks prone to over-fitting the noisy labels, which degrades performance. In this work, we propose an improved joint optimization framework for noise correction that uses a Combination of Mix-up entropy and Kullback-Leibler entropy (CMKL) as the loss function. The new loss function achieves better fine-tuning results after all label annotations have been updated. Experimental results on the publicly available CIFAR-10 and Clothing1M datasets show the superior performance of our approach compared with other state-of-the-art methods.
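The abstract does not give the exact formulation of the CMKL loss, but its named ingredients suggest a cross-entropy term computed on mix-up-interpolated samples combined with a KL-divergence regularizer (joint-optimization frameworks for label correction commonly regularize the mean prediction toward a class prior). The sketch below illustrates one plausible combination under those assumptions; all function names, the `prior` regularizer, and the weighting `alpha` are illustrative, not the authors' definitions.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mixup(x1, y1, x2, y2, lam):
    # Mix-up: convex combination of two inputs and their soft labels.
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def cross_entropy(p, y, eps=1e-12):
    # Mean cross-entropy between predictions p and soft labels y.
    return float(-np.sum(y * np.log(p + eps), axis=-1).mean())

def kl_to_prior(p, prior, eps=1e-12):
    # KL(prior || mean prediction): discourages collapse to few classes.
    # (Assumed regularizer; the paper's exact KL term is not in the abstract.)
    p_bar = p.mean(axis=0)
    return float(np.sum(prior * np.log((prior + eps) / (p_bar + eps))))

def cmkl_loss(logits, soft_labels, prior, alpha=1.0):
    # Illustrative CMKL-style loss: mix-up cross-entropy + alpha * KL term.
    p = softmax(logits)
    return cross_entropy(p, soft_labels) + alpha * kl_to_prior(p, prior)

# Tiny usage example with hypothetical mixed-up labels for 3 classes.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=4), rng.normal(size=4)
y1, y2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
_, y_mix = mixup(x1, y1, x2, y2, lam=0.7)
logits = rng.normal(size=(2, 3))
prior = np.full(3, 1.0 / 3.0)
loss = cmkl_loss(logits, np.stack([y_mix, y_mix]), prior)
```

The KL term here pulls the batch-averaged prediction toward a uniform prior, a common device for preventing the corrected labels from collapsing into a single class during joint optimization.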
