Abstract

Multi-view imbalanced learning handles datasets that have multi-view representations and imbalanced classes. Existing multi-view imbalanced learning methods fall into two main categories: multi-view ensemble learning and multi-view cost-sensitive learning. However, these methods suffer from the following problems: 1) they neglect either consensus or complementary information; 2) multi-view ensemble learning requires complex preprocessing and information fusion, while multi-view cost-sensitive learning requires manual assignment of misclassification costs; and 3) they have limited ability to handle noisy samples. We therefore aim to design a concise, unified framework that grapples with multi-view representations, imbalanced classes, and noisy samples simultaneously. Inspired by the merits of the support vector machine (SVM) and the quadratic type squared error (QTSE) loss function, we propose a robust multi-view instance-level cost-sensitive SVM with QTSE loss (MVQS) for imbalanced data classification. A consensus regularization term and a combination weight strategy are employed to fully exploit multi-view information. The QTSE loss adaptively imposes instance-level penalties on misclassified samples, making MVQS robust to noisy samples. We solve MVQS with the alternating direction method of multipliers (ADMM) and the gradient descent (GD) algorithm. Comprehensive experiments validate that MVQS is more competitive and robust than other benchmark approaches.
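To make the formulation concrete, a generic multi-view cost-sensitive SVM objective with a consensus regularization term could be sketched as below. The precise QTSE loss and MVQS objective are given in the full paper; the notation here (view weights \(\alpha_v\) with exponent \(r\), per-view classifiers \((w_v, b_v)\), trade-off parameters \(C\) and \(\lambda\)) is an illustrative assumption, not the paper's exact formulation:

\[
\min_{\{w_v, b_v\},\, \alpha}\ \sum_{v=1}^{V} \alpha_v^{\,r} \left[ \tfrac{1}{2}\lVert w_v \rVert^2 + C \sum_{i=1}^{n} \ell_{\mathrm{QTSE}}\!\left(y_i,\ w_v^{\top} x_i^{(v)} + b_v\right) \right] + \lambda \sum_{u < v} \sum_{i=1}^{n} \left( w_u^{\top} x_i^{(u)} - w_v^{\top} x_i^{(v)} \right)^{2}, \quad \text{s.t. } \sum_{v=1}^{V} \alpha_v = 1,\ \alpha_v \ge 0,
\]

where the bracketed term is each view's cost-sensitive SVM objective with the QTSE loss standing in for the hinge loss to impose adaptive, instance-level misclassification penalties, and the final term penalizes disagreement between the decision values of different views (the consensus regularization). Alternating between the per-view classifiers (e.g., via ADMM/GD) and the combination weights \(\alpha_v\) matches the optimization scheme described in the abstract.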
