Abstract

In the study of image classification, neural network learning relies heavily on datasets. Because some classes of images are much harder to collect than others, real-world datasets tend to suffer from class imbalance, which undoubtedly increases the difficulty of classification. During training, classes with many images are naturally seen more often than classes with few images. As a result of this imbalanced training, a neural network's classification ability on the test and validation sets differs greatly across categories: classes with more training samples achieve better test results, while classes with fewer training samples perform poorly. In this paper, we propose two balanced loss functions, namely CEFL loss and CEFL2 loss, obtained by rebalancing the cross-entropy loss function and the focal loss function. The experimental results show that the proposed loss functions significantly improve classification accuracy on class-imbalanced datasets.

Highlights

  • In the past decade, deep learning methods have been widely used in image classification, target detection, voice recognition and other related fields

  • A bold idea arises: since cross-entropy loss assigns a higher loss to well-classified samples than focal loss does, we can add a cross-entropy term to focal loss so that the neural network can better distinguish both easy samples and hard samples. To address this, we introduce two loss functions, namely CEFL loss and CEFL2 loss, based on cross-entropy loss and focal loss, which further increase the loss for easy samples and hard samples

  • One interesting problem is how to assign weights to major and minor classes automatically. Motivated by this idea of weight distribution, we propose two loss functions, CEFL loss and CEFL2 loss, to address the problem of training on imbalanced data samples by rebalancing cross-entropy loss and focal loss to expand the loss values of well-classified and poorly classified examples


Summary

INTRODUCTION

Deep learning methods have been widely used in image classification, target detection, voice recognition and other related fields. Our main contributions can be summarized as follows: (1) Based on the existing cross-entropy loss function and focal loss function, we propose two new balanced loss functions, CEFL loss and CEFL2 loss, which help to improve the image classification accuracy of class-imbalanced datasets.
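The paper does not give the exact CEFL/CEFL2 formulas in this excerpt, but the highlights describe adding a cross-entropy term to focal loss. The sketch below, under that assumption, implements the two standard ingredients (cross-entropy and focal loss, with the usual focusing parameter gamma) and one plausible additive combination; the name `cefl_loss` and the additive form are illustrative, not the authors' exact definition.

```python
import math

def cross_entropy(p):
    """Cross-entropy loss for the true-class probability p: -log(p)."""
    return -math.log(p)

def focal_loss(p, gamma=2.0):
    """Focal loss: -(1 - p)^gamma * log(p).

    The (1 - p)^gamma factor down-weights easy samples (p near 1),
    so training focuses on hard samples (p near 0).
    """
    return -((1.0 - p) ** gamma) * math.log(p)

def cefl_loss(p, gamma=2.0):
    """Hypothetical additive combination of the two losses.

    Adding the cross-entropy term restores some loss on
    well-classified samples that focal loss suppresses.
    """
    return cross_entropy(p) + focal_loss(p, gamma)
```

For an easy sample (p = 0.9), focal loss is nearly zero while cross-entropy still contributes, so the combined loss keeps a gradient signal on well-classified examples; for a hard sample (p = 0.1), the focal term dominates.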

RELATED WORK
FOCAL LOSS
EXPERIMENTS
CONCLUSION