Abstract

Software aging refers to the progressive performance degradation and increased failure rate observed in long-running software systems. Typically, the aging problem is caused by aging-related bugs (ARBs), which are activated by the runtime accumulation of error conditions. ARBs are difficult to detect during software testing; therefore, early identification of these bugs can help in building robust software systems. However, a major issue in ARB prediction is the skewness of the dataset (the class imbalance problem), which may bias the learning of classification algorithms and thus lead to a higher misclassification rate. This paper investigates the effect of instance filtering (resampling) and standardization techniques on various classification algorithms when predicting aging-related bugs in software systems. The experimental study uses four classification algorithms, namely logistic regression, support vector classifiers (SVC), random forest, and an artificial neural network (ANN) with a softmax function, on three datasets available in an open-source software repository. The results show improved performance of the classification algorithms with a reduced misclassification rate.
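The pipeline described above (resampling to balance the classes, standardizing features, then training several classifiers) can be sketched roughly as follows. This is a minimal illustration with scikit-learn, not the paper's actual setup: the synthetic dataset, random-oversampling strategy, and classifier hyperparameters are all assumptions, and the ANN is omitted for brevity.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.utils import resample
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

# Synthetic imbalanced dataset standing in for an ARB dataset
# (class 1 = aging-related bug, deliberately the minority class).
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Resampling: randomly oversample the minority class in the
# training split until both classes have equal counts.
n_major = int((y_tr == 0).sum())
minority_up = resample(X_tr[y_tr == 1], replace=True,
                       n_samples=n_major, random_state=0)
X_bal = np.vstack([X_tr[y_tr == 0], minority_up])
y_bal = np.array([0] * n_major + [1] * n_major)

# Standardization: fit the scaler on the (balanced) training
# data only, then apply the same transform to the test split.
scaler = StandardScaler().fit(X_bal)
X_bal_s = scaler.transform(X_bal)
X_te_s = scaler.transform(X_te)

# Three of the four classifiers named in the abstract.
scores = {}
for clf in (LogisticRegression(max_iter=1000), SVC(),
            RandomForestClassifier(random_state=0)):
    clf.fit(X_bal_s, y_bal)
    scores[type(clf).__name__] = clf.score(X_te_s, y_te)
print(scores)
```

Fitting the scaler only on training data avoids leaking test-set statistics into the model, which matters when reporting misclassification rates on imbalanced data.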
