Abstract

The training data commonly used in software defect prediction (SDP) often contains instances that have similar feature values but belong to different classes, which significantly degrades the performance of prediction models trained on such data. This is referred to as the class overlap problem (COP). Previous studies have concluded that COP has a more negative impact on prediction-model performance than the class imbalance problem (CIP), yet COP has received less research attention than CIP. Moreover, the performance of existing class overlap cleaning techniques depends heavily on hyperparameter settings, such as the value of K in the K-nearest neighbor or K-means algorithms, and finding optimal hyperparameters remains a challenge. In this study, we propose a novel technique named the radius-based class overlap cleaning technique (ROCT) to better alleviate COP in SDP without hyperparameter tuning. The basic idea of ROCT is to treat each instance as the center of a hypersphere and directly optimize the radius of that hypersphere. ROCT then identifies instances within the hypersphere whose labels differ from that of the center instance as overlapping instances and removes them. To evaluate ROCT, we conduct empirical experiments on 29 datasets collected from various software repositories, using K-nearest neighbor, random forest, logistic regression, and naive Bayes classifiers, measured by AUC, balance, pd, and pf. The experimental results show that ROCT performs best, significantly improving the performance of prediction models by as much as 15.2% and 29.9% in terms of AUC and balance, respectively, compared with existing class overlap cleaning techniques. This superior performance indicates that ROCT should be recommended as an effective alternative for alleviating COP in SDP.
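The core cleaning step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the use of Euclidean distance, and the `radius` parameter being supplied externally are all assumptions (the abstract states that ROCT optimizes the radius directly, but the optimization procedure is not given here, so the sketch takes the radius as an input).

```python
import numpy as np

def radius_overlap_cleaning(X, y, radius):
    """Hypothetical sketch of radius-based class overlap cleaning.

    Each instance is treated as the center of a hypersphere of the
    given radius; instances inside that hypersphere whose labels
    differ from the center's label are marked as overlapping and
    removed. How the radius is chosen/optimized is not shown.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n = len(X)
    overlapping = np.zeros(n, dtype=bool)
    idx = np.arange(n)
    for i in range(n):
        # Euclidean distance from the center instance to every instance
        d = np.linalg.norm(X - X[i], axis=1)
        inside = (d <= radius) & (idx != i)
        # opposite-label instances inside the hypersphere are overlapping
        overlapping |= inside & (y != y[i])
    keep = ~overlapping
    return X[keep], y[keep]
```

For example, with two nearby instances of opposite classes and one distant instance, both members of the overlapping pair fall inside each other's hypersphere and are removed, leaving only the distant, non-overlapping instance.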
