Abstract

Learning from imbalanced data is an important and common problem, and many methods have been proposed to address it, including sampling and cost-sensitive learning. This paper presents an effective wrapper approach that incorporates the evaluation measure directly into the objective function of a cost-sensitive neural network to improve classification performance, by simultaneously optimizing the combination of feature subset, intrinsic structure parameters, and misclassification costs. The optimization is based on Particle Swarm Optimization. The proposed method can be applied to both binary and multi-class classification. Experimental results on various standard benchmark datasets show that it is effective in comparison with commonly used sampling techniques.
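The abstract does not give implementation details, but the wrapper it describes can be sketched as a standard Particle Swarm Optimization loop in which each particle encodes a candidate solution (feature mask, network structure parameters, misclassification costs) and the fitness function is the classifier's evaluation measure on validation data. The sketch below is a minimal, generic PSO maximizer; the encoding and fitness named in the docstring are illustrative assumptions, not the paper's actual implementation.

```python
import random

def pso(fitness, dim, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer that maximizes `fitness`.

    Hypothetical mapping to the paper's wrapper: each particle's position
    vector would encode a feature subset mask, the network's structure
    parameters, and the misclassification costs, while `fitness` would
    train/evaluate the cost-sensitive neural network and return the chosen
    evaluation measure (e.g. G-mean) on held-out data.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_val = [fitness(p) for p in pos]    # personal best fitness values
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity update: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: maximize -(x^2 + y^2); the optimum is at the origin.
best, best_val = pso(lambda p: -sum(x * x for x in p), dim=2)
```

In the paper's setting the fitness evaluation would dominate the runtime, since each call trains a cost-sensitive network on the candidate feature subset.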
