Abstract

This article proposes a deep sparse autoencoder framework for structural damage identification. The framework can be employed to obtain optimal solutions for pattern recognition problems of a highly nonlinear nature, such as learning a mapping between vibration characteristics and structural damage. Three main components are defined in the proposed framework: the pre-processing component, which applies a data whitening process; the sparse dimensionality reduction component, in which the dimensionality of the original input vector is reduced while preserving the necessary information; and the relationship learning component, in which the mapping between the compressed feature and the stiffness reduction parameters of the structure is built. The framework utilizes a sparse autoencoder-based deep neural network to enhance the capability and performance of the dimensionality reduction and relationship learning components through a pre-training scheme. In the final stage of training, both components are jointly optimized to fine-tune the network towards achieving better accuracy in structural damage identification. Since structural damage usually occurs at only a small number of elements exhibiting stiffness reduction out of the large total number of elements in the entire structure, sparse regularization is adopted in this framework. Numerical studies on a steel frame structure are conducted to investigate the accuracy and robustness of the proposed framework in structural damage identification, taking into consideration the effects of noise in the measurement data and uncertainties in the finite element modelling. Experimental studies on a prestressed concrete bridge in the laboratory are conducted to further validate the performance of the proposed framework for structural damage identification.
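The pipeline described above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it shows a whitening pre-processing step and a single sparse autoencoder layer with a KL-divergence sparsity penalty (one common choice of sparse regularizer; the paper's exact regularizer, layer sizes, and the fine-tuning stage that maps features to stiffness reduction parameters are omitted). All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def whiten(X, eps=1e-8):
    """Pre-processing component: ZCA-style whitening.

    Centers each feature and rescales so the sample covariance
    is (approximately) the identity matrix.
    """
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / Xc.shape[0]
    U, S, _ = np.linalg.svd(cov)
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T
    return Xc @ W

class SparseAutoencoder:
    """One sparse autoencoder layer (illustrative sketch).

    In a deep framework, several of these would be stacked and
    pre-trained layer-by-layer, then jointly fine-tuned with the
    relationship-learning (regression) head.
    """

    def __init__(self, n_in, n_hidden, rho=0.05, beta=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b2 = np.zeros(n_in)
        self.rho = rho    # target average activation (sparsity level)
        self.beta = beta  # weight of the sparsity penalty

    def encode(self, X):
        # Sigmoid hidden activations: the compressed feature vector.
        return 1.0 / (1.0 + np.exp(-(X @ self.W1 + self.b1)))

    def loss(self, X):
        # Reconstruction error plus KL-divergence sparsity penalty.
        H = self.encode(X)
        X_rec = H @ self.W2 + self.b2
        recon = np.mean((X - X_rec) ** 2)
        rho_hat = np.clip(H.mean(axis=0), 1e-6, 1.0 - 1e-6)
        kl = np.sum(self.rho * np.log(self.rho / rho_hat)
                    + (1 - self.rho) * np.log((1 - self.rho) / (1 - rho_hat)))
        return recon + self.beta * kl
```

Training (e.g. gradient descent on `loss`) and the final joint fine-tuning stage would be added on top of this skeleton; the KL penalty drives the average hidden activation towards `rho`, mirroring the assumption that only a few elements of the structure exhibit stiffness reduction.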
