Abstract

Classifier combination methods currently receive great attention from machine learning researchers, as they are a powerful tool for improving classification accuracy. The approach is especially attractive for real-world problems, which are often characterized by imbalanced data; such imbalanced class distributions lead to poor performance for most conventional machine learning techniques. In this paper, we propose a novel weighted rough set meta-classifier framework over 14 base classifiers that finds the smallest optimal ensemble maximizing overall ensemble accuracy. We also propose a new entropy-based method to compute the weight of each classifier: each classifier is assigned a weight according to its contribution to classification accuracy. The powerful reduct technique of rough set theory guarantees high diversity among the produced reduct ensembles, and this higher diversity among the core classifiers improves performance on the minority class as well as overall system performance. Experimental results on the ozone dataset demonstrate the advantages of the weighted rough set meta-classifier framework over well-known meta-classifiers such as bagging, boosting, and random forest, as well as over each individual classifier.
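
The abstract does not give the exact weighting formula, so the following is only an illustrative sketch of one possible entropy-based weighting scheme: each base classifier is weighted by the normalized mutual information between its validation-set predictions and the true labels, and predictions are then combined by weighted voting. All function names (mutual_information, entropy_based_weights, weighted_vote) are hypothetical and not taken from the paper.

```python
import numpy as np

def mutual_information(y_true, y_pred):
    """Empirical Shannon mutual information I(Y; Y_hat) in nats,
    computed from the joint distribution of true and predicted labels."""
    mi = 0.0
    for t in np.unique(y_true):
        p_t = np.mean(y_true == t)
        for p in np.unique(y_pred):
            p_joint = np.mean((y_true == t) & (y_pred == p))
            p_p = np.mean(y_pred == p)
            if p_joint > 0:
                mi += p_joint * np.log(p_joint / (p_t * p_p))
    return mi

def entropy_based_weights(y_val, val_predictions):
    """Assign each classifier a weight proportional to the mutual
    information of its validation predictions with the true labels
    (a hypothetical stand-in for the paper's entropy-based weights)."""
    w = np.array([mutual_information(y_val, pred) for pred in val_predictions])
    total = w.sum()
    return w / total if total > 0 else np.full(len(w), 1.0 / len(w))

def weighted_vote(test_predictions, weights, n_classes):
    """Combine base-classifier label predictions by weighted voting."""
    n_samples = test_predictions[0].shape[0]
    scores = np.zeros((n_samples, n_classes))
    for w, pred in zip(weights, test_predictions):
        scores[np.arange(n_samples), pred] += w
    return scores.argmax(axis=1)

# Usage sketch: val_predictions / test_predictions are lists of integer
# label arrays, one per base classifier, e.g. from 14 fitted models.
# weights = entropy_based_weights(y_val, val_predictions)
# y_hat = weighted_vote(test_predictions, weights, n_classes=2)
```

In the paper's framework, such weights would feed a weighted rough set reduct computation that selects a minimal, diverse subset of the 14 classifiers; the sketch above covers only the weighting and voting steps.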
