Abstract

The performance of support vector machine (SVM) classifiers is prone to the class imbalance problem, which arises when one class severely outnumbers the other. Traditionally, this issue has been addressed by balancing the class distribution with sampling methods. This paper explores and applies the probabilistic active learning strategy StatQSVM (Mitra et al., 2004) to obtain balanced class distributions from large-scale imbalanced datasets. Rather than querying instances based on their proximity to the separating hyperplane, StatQSVM selects instances using a locally defined confidence factor with respect to the current hyperplane that models the class separation. The exploratory study of StatQSVM is carried out on both simulated and real-world imbalanced datasets, and performance deterioration is observed at high imbalance ratios. To overcome this problem, a fast probabilistic cost-weighted undersampling approach, called CStatQSVM, is proposed together with a new stopping criterion. Experimental results show that CStatQSVM improves prediction on both the minority and majority classes compared with the LOB and StatQSVM active learning methods, as well as other conventional methods that address the class imbalance problem.
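To make the confidence-based querying idea concrete, below is a minimal, hypothetical sketch of active learning with an SVM in the spirit described above: instead of labelling the whole pool, the learner repeatedly fits a hyperplane and queries the instances it is least confident about (smallest absolute margin). All names, parameters, and the seed-set construction here are illustrative assumptions, not the authors' StatQSVM/CStatQSVM implementation; the confidence factor is simplified to the distance from the current hyperplane.

```python
# Illustrative sketch only: confidence-based active querying with an SVM.
# Not the StatQSVM/CStatQSVM algorithm; all parameters are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Imbalanced toy data: roughly 95% majority (class 0), 5% minority (class 1).
X, y = make_classification(n_samples=1000, weights=[0.95, 0.05],
                           random_state=0)

rng = np.random.default_rng(0)
pos = np.where(y == 1)[0]
neg = np.where(y == 0)[0]
# Small stratified seed set so the first hyperplane sees both classes.
labelled = np.concatenate([rng.choice(pos, 5, replace=False),
                           rng.choice(neg, 45, replace=False)])
pool = np.setdiff1d(np.arange(len(X)), labelled)

clf = SVC(kernel="linear", class_weight="balanced")
for _ in range(10):                          # active-learning rounds
    clf.fit(X[labelled], y[labelled])
    # Simplified confidence factor: |distance to current hyperplane|.
    # Small values mark the instances the model is least certain about.
    conf = np.abs(clf.decision_function(X[pool]))
    query = pool[np.argsort(conf)[:20]]      # query 20 least-confident points
    labelled = np.concatenate([labelled, query])
    pool = np.setdiff1d(pool, query)

print(len(labelled))  # 50 seed + 10 rounds * 20 queries = 250
```

Because queries concentrate near the decision boundary, the labelled set ends up far more balanced than the pool itself, which is the effect the sampling strategy above relies on; a cost-weighted variant would additionally reweight or discard majority instances during these rounds.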
