Abstract
XGBoost is a promising machine learning method capable of predicting essential concrete properties and enhancing advanced concrete design. However, its baseline version still requires further study and development. In this investigation, the effectiveness of advanced XGBoost variants, including Ada-XGBoost, Bagging-XGBoost, Stacking-XGBoost, and Voting-XGBoost, in predicting the compressive strength (CS) of ultra-high-performance concrete (UHPC) was assessed. A database covering 810 results from the literature, comprising 15 inputs (12 UHPC components, two curing conditions, and sample age), was used to train the models. The performance of the five models was evaluated in terms of RMSE, MAE, and R² using a combination of 10-fold cross-validation (CV) and Monte Carlo (MC) simulation. The results showed that the Stacking-XGBoost and XGBoost models outperformed the other models in prediction accuracy for the CS of UHPC. Based on SHAP value analysis, age, fiber, slag, cement, sand, superplasticizer, water, relative humidity, and temperature were identified as the key parameters affecting the CS of UHPC. Furthermore, a quantitative analysis of their combined impact on the CS of UHPC was also provided.
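To illustrate the kind of evaluation pipeline described above, the following is a minimal sketch (not the authors' code) of a Stacking-XGBoost regressor scored with 10-fold CV on RMSE, MAE, and R². The synthetic data, base-learner settings, and meta-learner choice are assumptions standing in for the 810-sample, 15-input UHPC database.

```python
# Hedged sketch: Stacking-XGBoost evaluated with 10-fold CV (RMSE, MAE, R2).
# The data and hyperparameters below are placeholders, not the study's setup.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_validate
from xgboost import XGBRegressor

# Synthetic stand-in for the 15-input UHPC database (810 mixes).
X, y = make_regression(n_samples=810, n_features=15, noise=10.0, random_state=0)

# Stacking ensemble: XGBoost base learners combined by a linear meta-learner.
stack = StackingRegressor(
    estimators=[
        ("xgb_shallow", XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)),
        ("xgb_deep", XGBRegressor(n_estimators=300, max_depth=6, learning_rate=0.10)),
    ],
    final_estimator=Ridge(),
)

# 10-fold cross-validation; repeating this loop with different random seeds
# would approximate the Monte Carlo repetition mentioned in the abstract.
cv = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_validate(
    stack, X, y, cv=cv,
    scoring=("neg_root_mean_squared_error", "neg_mean_absolute_error", "r2"),
)
print("RMSE:", -scores["test_neg_root_mean_squared_error"].mean())
print("MAE :", -scores["test_neg_mean_absolute_error"].mean())
print("R2  :", scores["test_r2"].mean())
```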