Abstract

Class-imbalance learning is an important research area that has drawn continued attention across various real-world applications for many years. Undersampling is a key technique in class-imbalance learning for obtaining a balanced class distribution, but it may discard potentially crucial samples and can be influenced by outliers or noise in imbalanced data. Multiview learning methods have shown that models trained on different views can help each other improve their performance and robustness, yet existing imbalance learning approaches rely only on single-view samples. In this paper, we propose a multiview feature imbalance sampling method via meta self-paced learning (M2SPL) that effectively selects high-quality samples and separates adjacent features to improve the robustness of the trained model. Our proposed method offers two advantages: (1) an adaptive reweight generation process is a pivotal component of M2SPL; the adaptive density-based sample reweighting mechanism accounts for noisy and intractable samples to improve the robustness of the model; and (2) the multiview feature learning prevents the loss function from taking excessively large values when learning a robust model from the original data and enhances the discrimination capability of the model. Comparison with existing sampling approaches shows that our proposed M2SPL significantly improves classification performance, increasing the F1-score and G-mean by 15.4% and 12.5%, respectively, on average. Finally, our experimental results pass the Friedman and Holm tests, indicating that our model yields a significant improvement over existing methods.
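To make the self-paced reweighting idea concrete, the sketch below illustrates a generic hard-regularizer self-paced learning step, in which samples whose loss exceeds an age parameter are temporarily down-weighted to zero. This is only a minimal illustration of standard self-paced sample reweighting, not the paper's exact M2SPL formulation (its meta-learned, density-based weights and multiview features are not specified in this abstract); the function names `spl_weights` and `train_epoch` and the threshold schedule are assumptions for the example.

```python
# Minimal sketch of self-paced sample reweighting (hard regularizer).
# NOTE: a generic illustration, not the authors' M2SPL method.
import numpy as np

def spl_weights(losses: np.ndarray, lam: float) -> np.ndarray:
    """Hard self-paced weights: keep samples whose loss is below the age parameter lam."""
    return (losses < lam).astype(float)

def train_epoch(per_sample_loss_fn, params, X, y, lam):
    losses = per_sample_loss_fn(params, X, y)   # per-sample losses under current model
    w = spl_weights(losses, lam)                # down-weight hard/noisy samples to zero
    weighted_loss = np.sum(w * losses) / max(w.sum(), 1.0)
    return weighted_loss, w                     # weighted_loss would drive the parameter update

# Example: increasing lam gradually admits harder samples over the course of training.
rng = np.random.default_rng(0)
losses = rng.exponential(scale=1.0, size=10)
for lam in (0.5, 1.0, 2.0):
    w = spl_weights(losses, lam)
    print(f"lam={lam}: {int(w.sum())}/10 samples selected")
```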
