Abstract

XGBoost is a promising machine learning method capable of predicting essential concrete properties and enhancing advanced concrete design. However, its base version still requires further study and development. In this investigation, the effectiveness of advanced XGBoost variants, namely Ada-XGBoost, Bagging-XGBoost, Stacking-XGBoost, and Voting-XGBoost, in predicting the compressive strength (CS) of ultra-high-performance concrete (UHPC) was assessed. A database of 810 results collected from the literature, comprising 15 inputs (12 UHPC components, two curing conditions, and sample age), was used to train the models. The performance criteria for the five models, namely RMSE, MAE, and R², were evaluated using a combination of 10-fold cross-validation (CV) and Monte Carlo (MC) simulation. The results showed that the Stacking-XGBoost and XGBoost models outperformed the other models in prediction accuracy for the CS of UHPC. Based on SHAP value analysis, age, fiber, slag, cement, sand, superplasticizer, water, relative humidity, and temperature were identified as the key parameters affecting the CS of UHPC. Furthermore, a quantitative analysis of their combined impact on UHPC's CS was also provided.
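As an illustration of the workflow summarized above, and not the authors' implementation, the following Python sketch assembles a stacked ensemble of XGBoost learners, scores it with 10-fold cross-validation repeated in a Monte Carlo fashion (reporting RMSE, MAE, and R²), and computes SHAP values for a single fitted XGBoost model. The file name uhpc.csv, the target column CS, the Ridge meta-learner, and all hyperparameters are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code) of the evaluation protocol described
# in the abstract: a stacked XGBoost ensemble evaluated with repeated 10-fold
# CV, plus SHAP values for feature importance.
# Assumes xgboost, scikit-learn, shap, numpy, and pandas are installed; the
# dataset path "uhpc.csv" and column name "CS" are hypothetical.
import numpy as np
import pandas as pd
import shap
from xgboost import XGBRegressor
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import RepeatedKFold, cross_validate

data = pd.read_csv("uhpc.csv")                 # 810 mixes, 15 input columns + CS
X, y = data.drop(columns=["CS"]), data["CS"]

# Stacking: several XGBoost base learners blended by a linear meta-learner.
stack = StackingRegressor(
    estimators=[(f"xgb{i}", XGBRegressor(n_estimators=400, random_state=i))
                for i in range(3)],
    final_estimator=Ridge(),
)

# 10-fold CV repeated 10 times gives a Monte Carlo estimate of RMSE, MAE, R².
cv = RepeatedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_validate(
    stack, X, y, cv=cv,
    scoring=["neg_root_mean_squared_error", "neg_mean_absolute_error", "r2"],
)
print("RMSE:", -scores["test_neg_root_mean_squared_error"].mean())
print("MAE :", -scores["test_neg_mean_absolute_error"].mean())
print("R2  :", scores["test_r2"].mean())

# SHAP values for a single fitted XGBoost model (TreeExplainer needs a tree model).
xgb = XGBRegressor(n_estimators=400, random_state=0).fit(X, y)
shap_values = shap.TreeExplainer(xgb).shap_values(X)
print("Mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```

The mean absolute SHAP value per feature is one common way to rank inputs such as age, fiber, and slag by their influence on the predicted CS; the ranking reported in the abstract comes from the authors' own analysis, not from this sketch.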
