Abstract

Vibration-based damage identification methods are crucial for structural health monitoring. In recent years, deep learning methods such as convolutional neural networks (CNNs) have been widely applied to damage identification owing to their superior performance. To improve the accuracy of damage identification for complex frame structures and to make effective use of data measured by multiple sensors, this study proposes and compares four CNN-based damage identification methods built on four multi-channel data fusion approaches: data splicing, matrix reconstruction, data dimension elevation, and sub-model integration. First, the original one-dimensional vibration signals of the frame structures are measured. Multi-channel data fusion is then applied to these signals to construct the datasets used to train the CNN models. Finally, after parameter tuning, the trained CNN models automatically extract damage-sensitive features and identify the structural damage patterns. The effectiveness of the methods is evaluated on a numerical model of the IASC-ASCE benchmark structure, the Qatar University Grandstand Simulator test, and a test of a three-story stainless steel frame structure. The performance of the multi-channel methods is compared with that of methods using only single-channel data, and the effect of measurement noise on identification accuracy is analyzed. The results demonstrate that the four proposed CNN-based methods with multi-channel data fusion outperform methods relying solely on single-channel data and significantly improve damage identification accuracy.
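The four fusion strategies named above can be sketched in terms of array shapes. The following is a minimal illustration, not the paper's implementation: channel count, signal length, class count, and the placeholder per-channel predictor are all hypothetical assumptions, and the real method trains CNNs on each fused representation.

```python
import numpy as np

# Hypothetical setup: C = 4 sensor channels, each a 1-D vibration signal
# of N = 1024 samples (shapes chosen for illustration only).
C, N = 4, 1024
rng = np.random.default_rng(0)
signals = rng.standard_normal((C, N))  # stand-in for measured multi-channel data

# 1) Data splicing: concatenate the channels end-to-end into one long
#    1-D input for a 1-D CNN.
spliced = signals.reshape(-1)                      # shape (C*N,) = (4096,)

# 2) Matrix reconstruction: stack the channels as rows of a 2-D matrix,
#    suitable as a single-channel 2-D CNN input.
matrix = signals                                   # shape (C, N) = (4, 1024)

# 3) Data dimension elevation: fold each 1-D signal into a 2-D map and
#    treat the sensor channels like image channels (H, W, C).
side = int(np.sqrt(N))                             # assumes N is a perfect square
elevated = signals.reshape(C, side, side).transpose(1, 2, 0)  # (32, 32, 4)

# 4) Sub-model integration: one CNN per channel, fused at decision level.
#    A deterministic softmax placeholder stands in for each trained sub-model.
def submodel_predict(x, n_classes=5):
    logits = np.array([x[:n_classes].sum() + k for k in range(n_classes)])
    e = np.exp(logits - logits.max())
    return e / e.sum()

fused = np.mean([submodel_predict(ch) for ch in signals], axis=0)  # (5,)
print(spliced.shape, matrix.shape, elevated.shape, fused.shape)
```

The first three approaches fuse at the input level (one model sees all channels), while sub-model integration fuses at the output level by averaging per-channel class probabilities.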
