Abstract
Learning from imbalanced data is an important and common problem, and many methods have been proposed to address it, including sampling and cost-sensitive learning. This paper presents an effective wrapper approach that incorporates the evaluation measure directly into the objective function of a cost-sensitive neural network, improving classification performance by simultaneously optimizing the feature subset, the intrinsic structure parameters, and the misclassification costs. The optimization is carried out with Particle Swarm Optimization. The proposed method applies to both binary and multi-class classification. Experimental results on standard benchmark datasets show that it is effective in comparison with commonly used sampling techniques.
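The abstract describes jointly searching over a feature subset, network structure parameters, and misclassification costs with Particle Swarm Optimization, using an imbalance-aware evaluation measure as the fitness function. Below is a minimal sketch of that general idea, not the authors' implementation: the particle encoding, the one-hidden-layer cost-weighted network, the G-mean fitness, and all hyper-parameters are illustrative assumptions.

```python
# Sketch: PSO jointly optimizing feature mask, hidden-layer size, and per-class
# misclassification costs for a cost-sensitive neural network (assumed details).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def train_cost_sensitive_net(X, y, n_hidden, costs, lr=0.05, epochs=200):
    """One-hidden-layer softmax net trained with a cost-weighted cross-entropy."""
    n_classes = int(y.max()) + 1
    W1 = rng.normal(0, 0.1, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, n_classes)); b2 = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]
    w = costs[y]; w = w / w.mean()          # sample weight = cost of its true class
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(1, keepdims=True)); P /= P.sum(1, keepdims=True)
        G = (P - Y) * w[:, None] / len(X)   # gradient of weighted cross-entropy
        dW2 = H.T @ G; db2 = G.sum(0)
        dH = (G @ W2.T) * (1 - H ** 2)
        dW1 = X.T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    return lambda Xt: np.argmax(np.tanh(Xt @ W1 + b1) @ W2 + b2, axis=1)

def decode(p, n_feat, n_classes):
    """Map a particle position in [0,1]^d to (feature mask, hidden size, costs)."""
    mask = p[:n_feat] > 0.5
    if not mask.any():
        mask[np.argmax(p[:n_feat])] = True
    n_hidden = int(np.clip(round(p[n_feat] * 20), 2, 20))
    costs = np.clip(p[n_feat + 1:n_feat + 1 + n_classes], 0.1, None) * 10
    return mask, n_hidden, costs

def fitness(p, Xtr, ytr, Xva, yva):
    """Evaluation measure used as PSO fitness: G-mean of per-class recalls."""
    mask, n_hidden, costs = decode(p, Xtr.shape[1], int(ytr.max()) + 1)
    predict = train_cost_sensitive_net(Xtr[:, mask], ytr, n_hidden, costs)
    rec = recall_score(yva, predict(Xva[:, mask]), average=None, zero_division=0)
    return float(np.prod(rec) ** (1 / len(rec)))

# Imbalanced toy data (binary here; the encoding also handles multi-class labels).
X, y = make_classification(n_samples=600, n_features=20, weights=[0.9, 0.1], random_state=0)
Xtr, Xva, ytr, yva = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

dim = X.shape[1] + 1 + int(y.max()) + 1     # feature mask + hidden size + per-class costs
n_particles, iters = 15, 20
pos = rng.random((n_particles, dim)); vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([fitness(p, Xtr, ytr, Xva, yva) for p in pos])
gbest = pbest[pbest_f.argmax()]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    f = np.array([fitness(p, Xtr, ytr, Xva, yva) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()]

print("best validation G-mean:", pbest_f.max())
```

The key design choice mirrored here is that the fitness is the evaluation measure itself (a G-mean is assumed), so the swarm is rewarded for configurations that perform well on the minority classes rather than for raw accuracy.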