Abstract

Structural damage detection plays an important role in structural health monitoring (SHM) for engineering structures. However, monitored signals are easily contaminated by noise, and data from damaged states are difficult to obtain. In this work, a novel structural damage detection approach using multisensor spatial–temporal graph-based features and deep graph convolutional networks (DGCNs) is presented. The spatial–temporal graph is constructed using graph theory based on the continuous wavelet transform (CWT) of vibration signals. The multisensor spatial–temporal graph-based feature is then extracted from the Laplacian matrix derived from the spatial–temporal graph of the multisensor data. To overcome the limitation of small data size, which hinders the use of artificial neural networks and convolutional neural networks, a DGCN is utilized to classify the damage type of the monitored structure. The extracted multisensor spatial–temporal graph-based feature vector represents a node of the global graph, which serves as the input to the DGCN. Nodes corresponding to the same structural condition can be classified by the well-trained DGCN. Experiments on the International Association for Structural Control (IASC)–American Society of Civil Engineers (ASCE) SHM benchmark structure and the Qatar steel frame structure in the laboratory are performed to verify the effectiveness of the proposed approach. The experimental results show that the DGCN method can detect structural damage by learning from the constructed global graphs. Comparative experiments demonstrate that the proposed approach outperforms conventional approaches, especially in cases with limited data and noise pollution.
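The abstract's feature-extraction step (a sensor graph, its Laplacian matrix, and a compact graph-based feature vector) can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes each sensor is summarized by a feature row (e.g., CWT energies per scale), builds adjacency from pairwise correlations, and uses the smallest Laplacian eigenvalues as the feature vector; the function name and parameters are hypothetical.

```python
import numpy as np

def graph_laplacian_feature(X, k=4):
    """Sketch of a Laplacian-based graph feature for multisensor data.

    X : (n_sensors, n_features) array; each row is assumed to be a
        per-sensor descriptor, e.g. CWT energy per wavelet scale.
    k : number of smallest Laplacian eigenvalues to keep as the feature.
    """
    # Weighted adjacency from absolute pairwise correlation between sensors
    A = np.abs(np.corrcoef(X))
    np.fill_diagonal(A, 0.0)          # no self-loops
    D = np.diag(A.sum(axis=1))        # degree matrix
    L = D - A                         # combinatorial graph Laplacian
    # eigvalsh returns eigenvalues in ascending order for symmetric matrices
    eigvals = np.linalg.eigvalsh(L)
    return eigvals[:k]                # compact spectral feature vector

# Toy usage: 8 sensors, 64 descriptor values each
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 64))
feat = graph_laplacian_feature(X)
print(feat.shape)  # (4,)
```

In such a scheme, one feature vector per measurement window would become one node of the global graph fed to the DGCN; the smallest Laplacian eigenvalue of a connected graph is zero, so the informative content lies in the subsequent eigenvalues.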
