Abstract

Strain-based structural health monitoring has been widely applied in the transportation field. Existing strain-based damage identification methods suffer from complex workflows, delayed state evaluation, and limited automation. This paper adopts a deep learning approach to establish a network model that maps strain field information directly to damage information, and takes a subway bolster as the engineering background to realise end-to-end automatic damage identification. First, the problem of damage identification from the strain field is formulated; drawing on the idea of the fully convolutional network, the basic structure of the damage identification network is modularised and an overall design framework is proposed. Then, a damage simulation method is determined, and the feasibility of using it to construct a strain field damage dataset is verified. Programs for batch generation of random damage models and for the addition of random noise signals are coded, and datasets of the bolster under static and dynamic loads are obtained. Finally, the deep learning model is applied to the bolster damage dataset, and a residual module, BolRes_Att, that integrates spatial and channel attention mechanisms is proposed; it achieves better damage identification performance without increasing the number of model parameters. The average number of faulty elements reported by the improved damage identification network on the two bolster test sets is 3.02 and 2.92, respectively, accounting for about 0.016% of all elements, and the average processing time for one set of data is only 0.014 s. The results show that the deep learning model constructed in this paper can accurately and quickly identify element-level damage information from strain field information.
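The abstract describes BolRes_Att as a residual module that gates its features with both channel and spatial attention. The paper itself does not give the module's internals here, so the following is only a minimal NumPy sketch of that general pattern (channel gate, then spatial gate, then a residual connection); the function names and the parameter-free sigmoid gates are illustrative assumptions, whereas the real module would use learned convolutional layers.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    # x: (C, H, W) feature map. Pool over the spatial dims to get one
    # score per channel, then rescale each channel by its sigmoid gate.
    pooled = x.mean(axis=(1, 2))                 # shape (C,)
    return x * _sigmoid(pooled)[:, None, None]

def spatial_attention(x):
    # Pool over the channel dim to get one score per spatial location,
    # then rescale every channel at that location by its sigmoid gate.
    pooled = x.mean(axis=0)                      # shape (H, W)
    return x * _sigmoid(pooled)[None, :, :]

def bolres_att_block(x):
    # Hypothetical residual block: apply both attention gates in
    # sequence and add the input back (the residual connection).
    # A real implementation would insert conv + activation layers
    # before the gates; this sketch keeps only the attention/residual
    # structure, which by itself adds no learnable parameters.
    return x + spatial_attention(channel_attention(x))

if __name__ == "__main__":
    feat = np.random.rand(4, 8, 8)               # toy (C, H, W) feature map
    out = bolres_att_block(feat)
    print(out.shape)                             # same shape as the input
```

Because the sigmoid gates here contain no weights, this toy block leaves the parameter count unchanged, loosely mirroring the paper's claim that BolRes_Att improves identification performance without increasing model parameters.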
