Abstract

Many real-life problems can be described as imbalanced classification problems, in which the number of samples belonging to one class is heavily outnumbered by the numbers in the other classes. The classes with the larger and smaller sample proportions are referred to as the majority and the minority class, respectively. The traditional extreme learning machine (ELM) and support vector machine (SVM) give equal importance to all samples, leading to results biased towards the majority class. Many variants of ELM, such as weighted ELM (WELM) and boosting WELM (BWELM), have been designed to solve the class imbalance problem effectively. This work develops a novel UnderBagging-based kernelized ELM (UBKELM) to address the class imbalance problem more effectively. The proposed UnderBagging ensemble employs the kernelized ELM as its component classifier and incorporates the strengths of both random undersampling and bagging. Several balanced training subsets are created by randomly undersampling the majority class samples, with the number of subsets determined by the degree of class imbalance. The kernelized ELM is used as the component classifier because it is stable and has promising generalization performance. The computational cost of UBKELM is significantly lower than that of BWELM, BalanceCascade, EasyEnsemble, and the hybrid artificial bee colony WELM. The final outcome is computed by majority voting and soft voting over these classification models. The proposed work is assessed on benchmark real-world imbalanced datasets taken from the KEEL dataset repository, and the experimental results show that it outperforms the other classifiers for class imbalance learning.
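The abstract's pipeline (undersample the majority class into balanced subsets, train a kernelized ELM on each, combine by voting) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the regularization form beta = (I/C + K)^-1 T from the standard kernelized ELM, and the choice of tying the ensemble size to the imbalance ratio are assumptions for the sake of a runnable example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and B
    d2 = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

class KernelELM:
    """Minimal binary kernelized ELM: beta = (I/C + K)^-1 T."""
    def __init__(self, C=1.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        self.X = X
        # +1/-1 one-vs-all targets for binary labels {0, 1}
        T = np.where(y[:, None] == np.array([0, 1])[None, :], 1.0, -1.0)
        K = rbf_kernel(X, X, self.gamma)
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + K, T)
        return self

    def decision(self, X):
        # Per-class decision scores for new samples
        return rbf_kernel(X, self.X, self.gamma) @ self.beta

def underbagging_kelm(X, y, n_models=None, seed=0):
    """Train one kernelized ELM per balanced random subset.

    Assumes the minority class is labeled 1 and the majority class 0;
    by default the ensemble size is ceil(|majority| / |minority|),
    one plausible way to tie it to the degree of imbalance.
    """
    rng = np.random.default_rng(seed)
    min_idx = np.flatnonzero(y == 1)
    maj_idx = np.flatnonzero(y == 0)
    if n_models is None:
        n_models = int(np.ceil(len(maj_idx) / len(min_idx)))
    models = []
    for _ in range(n_models):
        # Random undersampling: draw |minority| majority samples
        sub = rng.choice(maj_idx, size=len(min_idx), replace=False)
        idx = np.concatenate([sub, min_idx])
        models.append(KernelELM().fit(X[idx], y[idx]))
    return models

def soft_vote(models, X):
    # Average the decision scores of all component classifiers
    scores = np.mean([m.decision(X) for m in models], axis=0)
    return scores.argmax(axis=1)

def majority_vote(models, X):
    # Each model casts a hard vote; ties broken towards class 0
    votes = np.stack([m.decision(X).argmax(axis=1) for m in models])
    return (votes.mean(axis=0) > 0.5).astype(int)
```

For example, on a 10:1 imbalanced toy problem with two well-separated Gaussian clusters, the ensemble above trains ten component classifiers and both voting schemes recover the minority class.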
