Just-in-Time Software Defect Prediction (JIT-SDP) operates in an online scenario where additional training data arrives over time. Existing online JIT-SDP studies have used online Oza ensemble learning methods with Hoeffding Trees as base learners to learn and update JIT-SDP models over time in this scenario. However, it is unknown how these approaches compare against offline learning approaches adapted to operate in online scenarios, and how the use of other online or offline base learners would affect online JIT-SDP in terms of predictive performance and computational cost. We therefore propose a new approach called Batch Oversampling Rate Boosting (BORB) that can use offline base learners in an online JIT-SDP scenario. Based on 10 open-source projects, we provide a comprehensive evaluation of BORB with 5 different base learners and of the existing online approach Oversampling Rate Boosting with 4 different base learners, in both within-project and cross-project online JIT-SDP scenarios. The results show that offline learning can achieve better predictive performance than the top-performing online learning approaches considered in our study, at a higher computational cost. Cross-project data helped to improve predictive performance for both offline and online learning, but especially for online learning.
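The online scenario described above is typically evaluated prequentially (test-then-train): each incoming commit is predicted first, and the model is updated only once its label becomes available. Below is a minimal pure-Python sketch of that loop with a toy incremental logistic-regression learner on synthetic change metrics; all names and data are illustrative assumptions, not the paper's BORB or ORB implementation.

```python
import math
import random

class OnlineLogisticRegression:
    """Toy incremental binary classifier, updated one example at a time."""
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def learn_one(self, x, y):
        # Single stochastic-gradient step on the logistic loss.
        err = self.predict_proba(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

def prequential_accuracy(stream, model):
    """Test-then-train: predict each commit before learning from it."""
    correct = 0
    n = 0
    for x, y in stream:
        pred = 1 if model.predict_proba(x) >= 0.5 else 0
        correct += (pred == y)
        model.learn_one(x, y)  # label assumed to arrive after prediction
        n += 1
    return correct / n

random.seed(0)
# Synthetic stream: two "change metrics" per commit; a commit is
# defect-inducing (label 1) iff the metrics sum to more than 1.
stream = []
for _ in range(500):
    x = [random.random(), random.random()]
    stream.append((x, 1 if x[0] + x[1] > 1.0 else 0))

acc = prequential_accuracy(stream, OnlineLogisticRegression(n_features=2))
print(f"prequential accuracy: {acc:.2f}")
```

In a real JIT-SDP setting the label of a commit only becomes known after a verification latency, so the training step would be delayed accordingly; the sketch ignores that delay for brevity.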