Abstract

Considering the increasing demand for software projects, estimating software development effort is essential and can improve project delivery quality. Machine learning methods are widely used to improve estimation accuracy. Boosting is an ensemble machine learning technique that has seen comparatively little use in this field. In this research, five boosting algorithms, AdaBoost, Gradient Boosting, XGBoost, LightGBM, and CatBoost, were implemented on the ISBSG database with the hyperparameter-tuning framework Optuna. Optuna is a next-generation optimization framework for automatically tuning algorithm hyperparameters. Six evaluation criteria, MMRE, MdMRE, MAE, MSE, Pred(0.25), and SA, were used to evaluate the findings. The results show that automatic hyperparameter tuning with Optuna increases the prediction accuracy of all five models. The CatBoost algorithm with Optuna-tuned hyperparameters produced the best predictions among the five algorithms studied. Compared with the default settings, the largest percentage improvement from using Optuna was observed in the XGBoost algorithm (except for the SA criterion). Based on the MMRE, Pred(0.25), and SA criteria, this study achieves better predictions than some comparable studies.
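The six criteria named above have standard definitions in the effort-estimation literature. As an illustration, the sketch below computes them in plain Python; the SA (Standardized Accuracy) baseline is approximated here by predicting the mean actual effort, a common stand-in for the random-guessing baseline, which may differ from the exact formulation the paper uses.

```python
import statistics

def evaluation_metrics(actual, predicted):
    """Illustrative computation of MMRE, MdMRE, MAE, MSE, Pred(0.25), and SA.

    SA uses a mean-prediction baseline as an approximation of the
    random-guessing MAE; the paper's exact baseline may differ.
    """
    n = len(actual)
    abs_err = [abs(a - p) for a, p in zip(actual, predicted)]
    # Magnitude of Relative Error for each project
    mre = [e / a for e, a in zip(abs_err, actual)]

    mae = sum(abs_err) / n                       # Mean Absolute Error
    mse = sum(e * e for e in abs_err) / n        # Mean Squared Error
    mmre = sum(mre) / n                          # Mean MRE
    mdmre = statistics.median(mre)               # Median MRE
    pred_25 = sum(m <= 0.25 for m in mre) / n    # share of estimates within 25%

    # Standardized Accuracy: improvement over a naive baseline predictor
    mean_actual = sum(actual) / n
    mae_baseline = sum(abs(a - mean_actual) for a in actual) / n
    sa = 1 - mae / mae_baseline

    return {"MMRE": mmre, "MdMRE": mdmre, "MAE": mae,
            "MSE": mse, "Pred(0.25)": pred_25, "SA": sa}
```

Lower MMRE, MdMRE, MAE, and MSE indicate better estimates, while higher Pred(0.25) and SA indicate better estimates, which matches how the abstract compares the five tuned models.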
