Abstract

The dramatic increase in dataset volumes available for training learning models has led to great advances in machine learning, but at the cost of slower training. This paper addresses the effect of data reduction on training speed while maintaining or improving classification accuracy. Whereas many studies have focused on feature selection but have not adequately considered instance selection, our work addresses both instance reduction and feature reduction, integrated into a holistic reduction approach. In prior work, we examined Simple Random Sampling without Replacement integrated with Information Gain-based feature selection, and compared its performance against unintegrated instance selection and feature selection applied individually at various reduction rates. Those results showed that the integration of instance and feature selection performed much better than instance or feature selection alone, in terms of both training speedup and accuracy. In this paper, we introduce a novel transpose-based instance selection approach, integrate it with feature selection, and compare its classification performance against the top-performing results of our prior work. Our results show that the new integrated method yields a significant increase in training speedup without harming classification accuracy; indeed, for some classifiers, such as Naive Bayes, accuracy improves considerably.
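For concreteness, the following is a minimal sketch of the integrated reduction pipeline described in the prior work: simple random sampling without replacement for instance reduction, followed by information gain-based feature selection. It assumes scikit-learn; the dataset, the 50% reduction rates, and the Naive Bayes classifier are illustrative choices, and `mutual_info_classif` stands in as an estimate of information gain. The transpose-based instance selection introduced in this paper is not reproduced here, since the abstract does not specify its details.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Instance reduction: simple random sampling without replacement
# (50% reduction rate, chosen only for illustration).
n_keep = int(0.5 * len(X_train))
idx = rng.choice(len(X_train), size=n_keep, replace=False)
X_red, y_red = X_train[idx], y_train[idx]

# Feature reduction: keep the top 50% of features ranked by mutual
# information with the class label, a proxy for information gain.
k = X_red.shape[1] // 2
selector = SelectKBest(mutual_info_classif, k=k).fit(X_red, y_red)
X_red = selector.transform(X_red)

# Train on the doubly reduced data; evaluate on the untouched test set.
clf = GaussianNB().fit(X_red, y_red)
acc = accuracy_score(y_test, clf.predict(selector.transform(X_test)))
print(f"accuracy after 50%/50% reduction: {acc:.3f}")
```

Applying instance reduction before feature scoring, as above, means the information gain estimates are computed only on the retained sample, which is what makes the combined pipeline cheaper than feature selection on the full training set.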
