Abstract

The coming digital transformation of modern industry will be built principally on software systems. Any such system should therefore be reliable and as free as possible of deficiencies such as software faults; maintaining this dependability is the central objective of software reliability. In this study, long short-term memory (LSTM) networks are employed, for the first time in this line of research, to forecast software faults. One-step walk-forward validation is used to generate the predictions. Because of the exponential nature of the data, the cumulative software fault count data were normalized using the Min-Max Scaler and the Box-Cox transformation. Each normalized data set was fed into an LSTM network. With the batch size held constant, the number of neurons and the number of epochs were varied over different combinations. The time-series software fault data were trained and tested after applying the Min-Max and Box-Cox transformations to obtain root mean square error (RMSE) values, and the two models were then compared. The model built with the Min-Max Scaler achieved lower RMSE values than the second model built with the Box-Cox transformation. To the best of our knowledge, the RMSE values obtained from software fault count data using LSTM are the first of their kind. Our models clearly show that LSTM can be used to predict software faults. We also calculated the data dispersion from the independently observed RMSE data points of each model; the quantified dispersion of the second model was found to be higher than that of the first.
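
As a rough illustration of the pipeline described above, the sketch below scales a cumulative fault count series with a Min-Max Scaler, fits a small LSTM, and evaluates it with one-step walk-forward validation and RMSE. It is a minimal sketch under stated assumptions, not the authors' implementation: the synthetic series, window size, neuron count, epoch count, batch size, and the choice of Keras/scikit-learn are all illustrative; the Box-Cox variant would simply replace the scaling step.

```python
# A minimal sketch, not the authors' implementation: one-step walk-forward
# LSTM forecasting of cumulative software fault counts with Min-Max scaling.
# The synthetic series, window size, neuron count, epochs, and batch size
# are illustrative assumptions, not the paper's data or tuned values.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Synthetic cumulative fault counts (monotonically non-decreasing).
rng = np.random.default_rng(0)
faults = np.cumsum(rng.poisson(3, size=60)).astype(float)

# Scale to [0, 1]; the paper's Box-Cox variant would transform here instead.
scaler = MinMaxScaler()
scaled = scaler.fit_transform(faults.reshape(-1, 1)).ravel()

def make_supervised(series, window):
    """Turn a 1-D series into (samples, window, 1) inputs with next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

window, n_test = 3, 10
predictions, actuals = [], []

# One-step walk-forward validation: refit on the growing history,
# forecast the next point, then advance by one observation.
for t in range(len(scaled) - n_test, len(scaled)):
    X_train, y_train = make_supervised(scaled[:t], window)
    model = Sequential([Input(shape=(window, 1)), LSTM(4), Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X_train, y_train, epochs=50, batch_size=1, verbose=0)
    x_next = scaled[t - window:t].reshape(1, window, 1)
    predictions.append(model.predict(x_next, verbose=0).item())
    actuals.append(scaled[t])

# RMSE on the original fault-count scale, as reported in the abstract.
pred_inv = scaler.inverse_transform(np.array(predictions).reshape(-1, 1))
act_inv = scaler.inverse_transform(np.array(actuals).reshape(-1, 1))
rmse = np.sqrt(mean_squared_error(act_inv, pred_inv))
print(f"Walk-forward RMSE: {rmse:.3f}")
```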
