Abstract

For class-imbalance problems, traditional supervised learning algorithms tend to favor majority instances (also called negative instances), which makes it difficult for them to accurately identify minority instances (also called positive instances). Ensemble learning is a common approach to the class-imbalance problem: multiple classifiers are built on the training dataset to improve the recognition accuracy of minority instances. The sliding window is a commonly used technique for processing data streams, yet few researchers have used sliding windows to select majority instances when constructing ensemble learning models. Traditional ensemble learning methods model some or all of the majority instances through oversampling or undersampling, but they also inherit the drawbacks of these preprocessing methods. In this paper, we therefore use similarity mapping to construct a pseudo-sequence of majority instances and, following the sliding-window idea, make full use of all existing majority instances, proposing a novel sliding window-based selective ensemble learning method (SWSEL) for the class-imbalance problem. SWSEL uses the idea of distance alignment from multi-view alignment to align the center of the minority instances with the majority instances, and slides over the pseudo-sequence of majority instances to select them. In addition, to prevent an excessive number of classifiers from leading to long running times, we use a distance metric to select a certain number of base classifiers for the final ensemble model. Extensive experiments on various real-world datasets show that, with SVM, MLP, and RF as base classifiers, SWSEL achieves statistically significant improvements over state-of-the-art methods on two evaluation metrics, AUC and G-mean.
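The core idea in the abstract can be illustrated with a minimal sketch: order the majority instances by distance to the minority center (a pseudo-sequence), slide a window over that sequence to draw balanced majority subsets, and train one base classifier per window. The helper names, the nearest-centroid stand-in for the base learner, and all parameters below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic imbalanced data: 100 majority (label 0), 10 minority (label 1).
X_maj = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
X_min = rng.normal(loc=3.0, scale=0.5, size=(10, 2))

def sliding_windows(X_maj, X_min, window_size, stride):
    """Order majority instances by distance to the minority center
    (forming a pseudo-sequence), then slide a window over it."""
    center = X_min.mean(axis=0)
    order = np.argsort(np.linalg.norm(X_maj - center, axis=1))
    seq = X_maj[order]
    return [seq[i:i + window_size]
            for i in range(0, len(seq) - window_size + 1, stride)]

def centroid_classifier(X_neg, X_pos):
    """Tiny stand-in base learner: nearest-centroid decision rule."""
    c_neg, c_pos = X_neg.mean(axis=0), X_pos.mean(axis=0)
    def predict(X):
        d_neg = np.linalg.norm(X - c_neg, axis=1)
        d_pos = np.linalg.norm(X - c_pos, axis=1)
        return (d_pos < d_neg).astype(int)
    return predict

# One balanced base classifier per window; combine by majority vote.
windows = sliding_windows(X_maj, X_min, window_size=10, stride=10)
classifiers = [centroid_classifier(w, X_min) for w in windows]

def ensemble_predict(X):
    votes = np.stack([clf(X) for clf in classifiers])
    return (votes.mean(axis=0) >= 0.5).astype(int)

print(ensemble_predict(X_min).mean())  # fraction of minority instances recovered
```

The sketch omits the paper's similarity mapping, distance alignment, and classifier-selection step; it only shows how sliding windows let every majority instance contribute to some base learner instead of being discarded by undersampling.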
