Abstract
The extreme gradient boosting (XGBoost) ensemble learning algorithm excels at modeling complex nonlinear relationships. To accurately predict mining-induced surface subsidence, this work combines a genetic algorithm (GA) with XGBoost and implements the resulting GA-XGBoost model in Python. The GA optimizes the XGBoost hyperparameter vector to improve the prediction accuracy and reliability of the model. Evaluated on domestic mining subsidence data sets, the GA-XGBoost model achieves a coefficient of determination (R2) of 0.941, a root mean square error (RMSE) of 0.369, and a mean absolute error (MAE) of 0.308. Compared with classic ensemble learning models such as XGBoost, random forest, and gradient boosting, the GA-XGBoost model delivers higher prediction accuracy and better performance than any single machine learning model.
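To illustrate how a GA can tune an XGBoost hyperparameter vector and how the reported metrics (R2, RMSE, MAE) are computed, the sketch below shows one possible Python implementation. The gene encoding, parameter bounds, GA settings (population size, uniform crossover, random-reset mutation), and synthetic data are illustrative assumptions for this sketch; they are not the configuration or data used in the paper.

# Minimal sketch of GA-driven XGBoost hyperparameter optimization.
# Requires numpy, scikit-learn, and xgboost. Parameter bounds and GA
# settings are assumptions, not the paper's reported configuration.
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(42)

# Hyperparameter vector: (max_depth, learning_rate, n_estimators, subsample)
BOUNDS = [(2, 10), (0.01, 0.3), (50, 500), (0.5, 1.0)]


def decode(ind):
    """Turn a gene vector into an XGBoost regressor."""
    return XGBRegressor(
        max_depth=int(round(ind[0])),
        learning_rate=ind[1],
        n_estimators=int(round(ind[2])),
        subsample=ind[3],
        objective="reg:squarederror",
    )


def fitness(ind, X_tr, y_tr, X_va, y_va):
    """Fitness = validation R^2 of the model encoded by the gene vector."""
    model = decode(ind)
    model.fit(X_tr, y_tr)
    return r2_score(y_va, model.predict(X_va))


def ga_search(X, y, pop_size=20, generations=15, mutation_rate=0.2):
    """Elitist GA: keep the best half, refill via uniform crossover + mutation."""
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)
    pop = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, X_tr, y_tr, X_va, y_va), reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            i, j = rng.choice(len(elite), size=2, replace=False)
            child = [a if rng.random() < 0.5 else b      # uniform crossover
                     for a, b in zip(elite[i], elite[j])]
            child = [rng.uniform(lo, hi) if rng.random() < mutation_rate else g
                     for g, (lo, hi) in zip(child, BOUNDS)]   # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind, X_tr, y_tr, X_va, y_va))


if __name__ == "__main__":
    # Synthetic stand-in for the subsidence data set (not the paper's data).
    X = rng.normal(size=(300, 6))
    y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=300)

    best = ga_search(X, y)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=1)
    model = decode(best)
    model.fit(X_tr, y_tr)
    pred = model.predict(X_va)
    print("R2  :", r2_score(y_va, pred))
    print("RMSE:", np.sqrt(mean_squared_error(y_va, pred)))
    print("MAE :", mean_absolute_error(y_va, pred))

The fitness function here simply maximizes validation R2; other choices (e.g., minimizing RMSE or using cross-validation) fit the same GA loop by swapping the scoring call.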