Abstract
Deep learning networks enable automated damage identification and performance evaluation of concrete structures using the electromechanical impedance/admittance (EMI/EMA) technique, but the quantity and quality of training data limit the performance of such data-driven networks. For the first time, this paper proposes a data-augmentation approach using deep-convolutional admittance generative adversarial networks (AdmiGAN) to overcome data deficiency and measurement inefficiency in deep learning-based flexural performance evaluation of reinforced concrete (RC) structures. In this approach, a new data normalization procedure was developed to support AdmiGAN-based EMA data synthesis, and the synthetic datasets were fed into an adaptive convolutional neural network (CNN) for deep learning. A proof-of-concept experiment was conducted on an RC beam under four-point bending, continuously monitored from initial loading to final failure by three surface-bonded piezoelectric ceramic lead zirconate titanate (PZT) patches. Qualitative stress and damage detection was performed through traditional feature analysis of the EMA signatures, and automated performance evaluation was carried out with the CNN. The results demonstrated that the AdmiGAN required only five groups of measured EMA signatures to generate a high-accuracy dataset 174 times faster than the conventional measurement method, and that the AdmiGAN combined with the CNN provides a new paradigm of data-driven structural performance evaluation with high accuracy, efficiency, and intelligence.
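The abstract does not reproduce the AdmiGAN architecture or the normalization procedure; the following PyTorch sketch only illustrates the general idea described above, under stated assumptions: EMA signatures are min-max scaled to the tanh output range of the generator, and a small 1-D deep-convolutional GAN is trained to synthesize additional signatures. The 256-point sweep length, latent size, layer dimensions, and all names (normalize_ema, Generator, Discriminator, train_step) are illustrative assumptions, not the authors' implementation.

```python
# AdmiGAN-style sketch (assumptions: 256-point EMA sweeps; all layer
# sizes and names are illustrative, not the paper's implementation).
import torch
import torch.nn as nn

SIG_LEN = 256   # assumed number of frequency points per EMA signature
LATENT = 64     # assumed latent-vector length

def normalize_ema(sig: torch.Tensor) -> torch.Tensor:
    """Min-max scale each signature (batch, SIG_LEN) to [-1, 1] so real
    data matches the tanh output range of the generator (assumed step)."""
    lo = sig.min(dim=1, keepdim=True).values
    hi = sig.max(dim=1, keepdim=True).values
    return 2.0 * (sig - lo) / (hi - lo + 1e-12) - 1.0

class Generator(nn.Module):
    """Maps a latent vector (batch, LATENT, 1) to a synthetic signature
    (batch, 1, SIG_LEN) through transposed 1-D convolutions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose1d(LATENT, 128, 4, 4), nn.BatchNorm1d(128), nn.ReLU(),
            nn.ConvTranspose1d(128, 64, 4, 4), nn.BatchNorm1d(64), nn.ReLU(),
            nn.ConvTranspose1d(64, 1, 16, 16), nn.Tanh(),   # length 1 -> 4 -> 16 -> 256
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores a signature (batch, 1, SIG_LEN) with one real/fake logit."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, 16, 16), nn.LeakyReLU(0.2),
            nn.Conv1d(64, 128, 4, 4), nn.LeakyReLU(0.2),
            nn.Conv1d(128, 1, 4, 4),                        # length 256 -> 16 -> 4 -> 1
        )
    def forward(self, x):
        return self.net(x).view(-1)

def train_step(G, D, real, opt_g, opt_d, bce=nn.BCEWithLogitsLoss()):
    """One adversarial update; `real` is normalized data (batch, 1, SIG_LEN)."""
    b = real.size(0)
    fake = G(torch.randn(b, LATENT, 1))
    opt_d.zero_grad()                     # D step: real -> 1, fake -> 0
    loss_d = bce(D(real), torch.ones(b)) + bce(D(fake.detach()), torch.zeros(b))
    loss_d.backward(); opt_d.step()
    opt_g.zero_grad()                     # G step: make D output 1 on fakes
    loss_g = bce(D(fake), torch.ones(b))
    loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```

In such a setup, a measured batch of shape (batch, SIG_LEN) would be prepared as normalize_ema(measured).unsqueeze(1) before each train_step call; once trained, Generator(torch.randn(n, LATENT, 1)) yields n synthetic signatures, which is how a GAN can expand a handful of measured EMA groups into a large training set.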
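Likewise, the "adaptive CNN" that consumes the synthetic dataset could in spirit resemble the minimal 1-D classifier below; the layer widths and the four output classes (e.g., loading or damage stages) are assumptions for illustration, not the paper's architecture.

```python
# Minimal 1-D CNN classifier sketch for EMA signatures (illustrative only;
# EmaCNN, its layer sizes, and the 4-class label set are assumptions).
import torch
import torch.nn as nn

class EmaCNN(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, 7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),        # pool out the frequency axis
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):                   # x: (batch, 1, sig_len)
        return self.head(self.features(x).flatten(1))

# usage: stage logits for 8 signatures of 256 frequency points each
logits = EmaCNN()(torch.randn(8, 1, 256))
```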