Abstract

Extreme learning machines (ELMs) have been shown, both theoretically and experimentally, to achieve promising performance at a fast learning speed for supervised classification tasks. However, they do not perform well on imbalanced binary classification tasks and tend to become biased toward the majority class. Moreover, since a large amount of labeled training data is not always available in the real world, there is an urgent demand for an efficient semi-supervised version of ELM for imbalanced binary classification tasks. In this article, owing to the distinct insensitivity of the area under the ROC curve (AUC) to both class skews and changes in class distributions, we focus on integrating AUC maximization into the ELM framework to tackle imbalanced binary classification tasks. By demystifying the AUC metric within the ELM framework, we develop a new AUC-based ELM, called AUC-ELM, for imbalanced binary classification, which is revealed to be essentially equivalent to an ELM on a transformed data space. Accordingly, its semi-supervised version, called SAUC-ELM, is also developed. Both AUC-ELM and SAUC-ELM have two distinctive merits: 1) they share the advantages of ELM in both generalization capability and training efficiency, while being uniquely tailored for imbalanced binary classification tasks, and 2) in contrast to existing imbalanced variants of ELM, such as class-specific cost regulation ELM and semi-supervised ELM, they have fewer parameters to tune, thereby reducing the computational cost of model selection. Experiments on a wide range of datasets show that both AUC-ELM and SAUC-ELM outperform the other comparative methods in terms of both classification performance and training speed.
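To illustrate the core idea sketched in the abstract, the following is a minimal, hedged Python sketch (not the authors' exact formulation): AUC maximization within an ELM can be viewed as ordinary regularized ELM training on a transformed data set built from pairwise differences of positive- and negative-class hidden-layer outputs. All constants (hidden-layer size, regularization parameter `C`, sigmoid activation) are illustrative assumptions.

```python
import numpy as np

def elm_hidden(X, W, b):
    """Random-feature ELM hidden layer with a sigmoid activation (assumed choice)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

rng = np.random.default_rng(0)
n_pos, n_neg, d, L = 30, 120, 5, 50          # imbalanced toy data, L hidden nodes
X_pos = rng.normal(1.0, 1.0, (n_pos, d))
X_neg = rng.normal(-1.0, 1.0, (n_neg, d))

W = rng.normal(size=(d, L))                  # random input weights (fixed, not trained)
b = rng.normal(size=L)

H_pos = elm_hidden(X_pos, W, b)
H_neg = elm_hidden(X_neg, W, b)

# Transformed data space: all pairwise differences h(x_i^+) - h(x_j^-),
# each paired with a target of 1 (a positive sample should be ranked above a negative one).
D = (H_pos[:, None, :] - H_neg[None, :, :]).reshape(-1, L)
t = np.ones(D.shape[0])

# ELM-style regularized least-squares solution for the output weights beta.
C = 1.0                                      # regularization parameter (assumed value)
beta = np.linalg.solve(D.T @ D + np.eye(L) / C, D.T @ t)

# Score samples; larger scores mean "more likely positive".
scores_pos = H_pos @ beta
scores_neg = H_neg @ beta
emp_auc = np.mean(scores_pos[:, None] > scores_neg[None, :])
print(f"empirical training AUC: {emp_auc:.3f}")
```

Because the pairwise-difference construction yields a standard ELM least-squares problem, the sketch retains the closed-form training step that gives ELM its speed, while the single parameter `C` reflects the abstract's claim of fewer parameters to tune.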
