Abstract

The binary class imbalance problem refers to the scenario in which the number of training samples in one class is much smaller than the number of samples in the other class. This imbalance hinders the ability of conventional machine learning algorithms to classify accurately. Moreover, many real-world training datasets are not only imbalanced but also low-resourced. In this paper we introduce a novel technique to handle the class imbalance problem, even in low-resource scenarios. In our approach, instead of learning from one sample at a time, as is common, two samples are considered simultaneously to train the classifier. This simultaneous two-sample learning appears to help the classifier learn both intra- and inter-class properties. Experiments conducted on a large number of benchmark datasets demonstrate the improved performance of our technique over existing state-of-the-art techniques.
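
The abstract does not detail the pairing scheme, so the following is a minimal illustrative sketch under one plausible reading: pairs of training samples are drawn at random, concatenated, and labeled by whether the two samples share a class, so the model is exposed to both intra-class (same-label) and inter-class (different-label) pairs. The synthetic data, the logistic pair classifier, and helper names such as `make_pairs` are assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch of pairwise ("two-sample") training on imbalanced data.
# Everything below (pair construction, logistic model, hyperparameters) is an
# assumption for illustration; the paper's exact formulation is not given here.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic imbalanced binary data: 200 majority vs. 20 minority samples.
X_maj = rng.normal(loc=0.0, size=(200, 2))
X_min = rng.normal(loc=2.0, size=(20, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 200 + [1] * 20)

def make_pairs(X, y, n_pairs, rng):
    """Draw random sample pairs; the pair label is 1 if both share a class."""
    i = rng.integers(0, len(X), size=n_pairs)
    j = rng.integers(0, len(X), size=n_pairs)
    X_pair = np.hstack([X[i], X[j]])        # concatenated pair features
    y_pair = (y[i] == y[j]).astype(float)   # intra-class vs. inter-class pair
    return X_pair, y_pair

# Train a logistic classifier on pairs with plain gradient descent.
Xp, yp = make_pairs(X, y, n_pairs=4000, rng=rng)
w = np.zeros(Xp.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Xp @ w + b)))  # sigmoid probabilities
    grad = p - yp                            # gradient of the logistic loss
    w -= 0.1 * (Xp.T @ grad) / len(yp)
    b -= 0.1 * grad.mean()

print("pair-level training accuracy:", ((p > 0.5) == yp).mean())
```

Because pairs are sampled with replacement, each minority sample participates in many intra- and inter-class pairs, which is one intuition for why pairwise training can help in low-resource, imbalanced settings.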
