Abstract

Classifier learning with imbalanced data is one of the main challenges in the data mining community. Ensembles of classifiers are a popular solution to this problem and have attracted significant attention owing to their better performance compared with individual classifiers. In this paper, we propose an ensemble method for imbalanced classification, hereafter referred to as overlap and imbalance sensitive random forest (OIS-RF). We account for class overlap in imbalanced data and introduce a new coefficient, Hard To Learn (HTL), which measures the importance of each training instance. OIS-RF then focuses on learning the high-importance instances in each sub-dataset. Furthermore, to encourage diversity within the ensemble, a weighted bootstrap method is proposed to generate sub-datasets containing diverse local information. The proposed method is evaluated on imbalanced datasets and supported by statistical analyses. The results show that our method outperforms 9 state-of-the-art ensemble algorithms.
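The abstract does not give the exact HTL formula, but the two ingredients it names, an instance-level hardness score tied to class overlap and a bootstrap whose sampling probabilities favor hard instances, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the `htl_scores` proxy (fraction of opposite-class nearest neighbours, a common overlap measure) and the base-weight-plus-score weighting scheme are assumptions introduced here for illustration.

```python
import numpy as np

def htl_scores(X, y, k=5):
    """Illustrative stand-in for the paper's Hard To Learn (HTL) coefficient:
    the fraction of an instance's k nearest neighbours that belong to the
    opposite class (a common proxy for class overlap). The actual HTL
    definition is not given in the abstract."""
    # Pairwise Euclidean distances between all instances.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # exclude each instance itself
    nn = np.argsort(d, axis=1)[:, :k]      # indices of k nearest neighbours
    # Score in [0, 1]: 0 = deep inside its own class, 1 = fully overlapped.
    return (y[nn] != y[:, None]).mean(axis=1)

def weighted_bootstrap(X, y, rng, k=5):
    """Draw one bootstrap sub-dataset with sampling probabilities tilted
    toward hard-to-learn (overlapping) instances, so that different draws
    emphasise different local regions."""
    w = 1.0 + htl_scores(X, y, k)          # base weight 1 plus hardness score
    p = w / w.sum()                        # normalise to probabilities
    idx = rng.choice(len(y), size=len(y), replace=True, p=p)
    return X[idx], y[idx]
```

In an ensemble setting, each tree would be trained on a separate `weighted_bootstrap` draw, so trees see overlapping regions more often while still differing in which instances they emphasise.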
