Abstract

Machine learning-based surrogate models have become the preferred approach for large-scale and frequent simulation tasks because of the significant gains they offer in computational efficiency. To overcome the potential effects of learning from small samples and the challenges of multi-feature, multi-task learning, we developed a novel deep learning long short-term memory (LSTM) model. Taking slope excavation displacement prediction as a case study, we used Latin hypercube sampling to generate a synthetic dataset for training the LSTM and other mainstream models. Experimental results show that prediction accuracy decreases as the sample size is reduced, although support vector regression (SVR), back propagation neural network (BPNN), LSTM and Gaussian process regression (GPR) are more resistant to this effect. It is feasible to use excavation features as model inputs to establish a unified multi-step excavation model, but the accuracy of the SVR model decreased by 32.5% after the excavation features were added. Even with fewer than 50 samples, both LSTM and GPR perform excellently, achieving an R-squared above 0.99 and an RMSE below 0.07 mm. For multi-output learning tasks, however, LSTM stands out as the best choice. This study will help researchers and engineers quickly select appropriate surrogate models.
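To make the workflow summarised above concrete, the following is a minimal sketch of the pipeline it describes: Latin hypercube sampling of input parameters, an LSTM surrogate that predicts a displacement for each excavation step, and evaluation with R-squared and RMSE. It assumes Python with SciPy and PyTorch; the parameter names, value ranges, network size, and the placeholder displacement response are illustrative assumptions, not the configuration or data used in the study.

# Hypothetical sketch of the surrogate-modelling pipeline described in the abstract.
# Feature names, bounds, network sizes, and the synthetic response are assumptions.
import numpy as np
from scipy.stats import qmc
from sklearn.metrics import r2_score, mean_squared_error
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# 1. Latin hypercube sampling of input features (assumed bounds), small-sample setting
n_samples, n_features, n_steps = 50, 4, 6
sampler = qmc.LatinHypercube(d=n_features, seed=0)
unit = sampler.random(n=n_samples)                       # samples in the unit hypercube
lower = np.array([10.0, 0.20, 15.0, 1.0])                # e.g. cohesion, Poisson ratio,
upper = np.array([40.0, 0.40, 35.0, 6.0])                # friction angle, excavation stage
X = qmc.scale(unit, lower, upper)

# Placeholder cumulative displacement per excavation step; in practice these
# targets would come from numerical (e.g. FEM) simulations of each sample.
y = np.cumsum(0.05 * X[:, :1] / X[:, 2:3] + 0.01 * rng.random((n_samples, n_steps)), axis=1)

# 2. LSTM surrogate: the feature vector is repeated over excavation steps and the
#    network outputs one displacement per step (multi-output sequence learning).
class LSTMSurrogate(nn.Module):
    def __init__(self, n_in, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_in, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                                # x: (batch, steps, features)
        out, _ = self.lstm(x)
        return self.head(out).squeeze(-1)                # (batch, steps)

X_seq = torch.tensor(np.repeat(X[:, None, :], n_steps, axis=1), dtype=torch.float32)
y_t = torch.tensor(y, dtype=torch.float32)

model = LSTMSurrogate(n_features)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X_seq), y_t)
    loss.backward()
    opt.step()

# 3. Evaluation with the metrics reported in the abstract (R-squared and RMSE)
pred = model(X_seq).detach().numpy().ravel()
print("R2:  ", r2_score(y.ravel(), pred))
print("RMSE:", np.sqrt(mean_squared_error(y.ravel(), pred)))

For brevity the sketch scores the model on its own training data; a study comparing surrogate models would instead hold out a validation set and repeat the fit for each candidate model (SVR, BPNN, GPR) and sample size.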
