Abstract

Real-world classification often encounters the problem of class imbalance: when some classes contain far more examples than others, traditional classifiers tend to bias their decision boundaries toward the majority classes. Most existing cost-sensitive strategies either ignore the hard-to-learn examples or introduce a large number of hyper-parameters. This article proposes an adaptive-learning cost-sensitive convolutional neural network to address this problem. During training, the proposed method embeds a class-dependent cost for each class into the global error, shifting the decision boundary toward the minority classes. Meanwhile, a distribution weight is assigned to each example to strengthen the learning of hard-to-learn examples. Both the class-dependent costs and the distribution weights are learnt automatically within the network. This cost-sensitive approach makes the algorithm focus on examples in the minority classes as well as on the hard-to-learn examples within each class. Moreover, the approach applies to both binary and multi-class image classification problems without modification. Experiments on four image classification datasets show that the proposed method achieves better performance than the baseline algorithms and several competing methods.
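The following is a minimal sketch of how such a cost-sensitive objective could be realised, assuming a cross-entropy base loss in PyTorch. The class name `CostSensitiveLoss`, the softplus parameterisation of the class costs, and the loss-based per-example weighting are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CostSensitiveLoss(nn.Module):
    """Illustrative cost-sensitive loss: a learnable class-dependent cost
    scales the error of each class, and a per-example distribution weight
    emphasises hard-to-learn examples (approximated here by the example's
    current loss). This is a sketch, not the authors' exact method."""

    def __init__(self, num_classes: int):
        super().__init__()
        # Class-dependent costs, learnt jointly with the network weights.
        self.class_costs = nn.Parameter(torch.ones(num_classes))

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # Per-example cross-entropy, kept unreduced so it can be reweighted.
        per_example = F.cross_entropy(logits, targets, reduction="none")
        # Distribution weight: larger for examples the model currently gets wrong.
        # Detached so the weight itself receives no gradient.
        dist_weight = torch.softmax(per_example.detach(), dim=0) * per_example.numel()
        # Class-dependent cost for each example's true class, kept positive.
        cost = F.softplus(self.class_costs)[targets]
        return (cost * dist_weight * per_example).mean()
```

In use, the loss module's parameters would be passed to the optimiser together with the CNN's weights, so the class-dependent costs are updated by backpropagation rather than tuned as hyper-parameters.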
