Abstract

Condition assessment of civil infrastructure from manual inspections can be time-consuming, subjective, and unsafe. Advances in computer vision and Deep Neural Networks (DNNs) provide methods for automating important condition assessment tasks such as damage and context identification. One critical challenge in training robust and generalizable DNNs for damage identification is the difficulty of obtaining large and diverse datasets. To make the most of available data, researchers have investigated using synthetic images of damaged structures from Generative Adversarial Networks (GANs) for data augmentation. However, GANs are limited in the diversity of data they can produce, as they can only interpolate between samples of damaged structures in a dataset. Unpaired image-to-image translation using Cycle-Consistent Adversarial Networks (CCAN) provides one means of extending the diversity of, and control over, generated images, but it has not been investigated for applications in condition assessment. We present EIGAN, a novel CCAN architecture for generating realistic synthetic images of a damaged structure given an image of its undamaged state. EIGAN can translate undamaged images to damaged representations and vice versa while retaining the geometric structure of the infrastructure (e.g., building shape, layout, color, and size). We create a new unpaired dataset of damaged and undamaged building images taken after the 2017 Puebla Earthquake. Using this dataset, we demonstrate, with both qualitative and quantitative measures, how EIGAN addresses the shortcomings of three other established CCAN architectures for damage translation. Additionally, we introduce a new methodology for exploring the latent space of EIGAN, allowing some control over the properties of the generated damage (e.g., the damage severity). The results demonstrate that unpaired image-to-image translation of undamaged structures to damaged ones is an effective means of data augmentation for improving network performance.
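
For readers unfamiliar with cycle-consistent translation, the equations below sketch the standard CycleGAN-style objective that CCAN architectures such as EIGAN typically build on; the exact loss terms used by EIGAN (e.g., any structure-preserving or latent-space regularizers) are not specified in this abstract, and the notation here is illustrative.

\[
\mathcal{L}(G, F, D_X, D_Y) = \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y) + \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X) + \lambda\, \mathcal{L}_{\mathrm{cyc}}(G, F),
\]
\[
\mathcal{L}_{\mathrm{cyc}}(G, F) = \mathbb{E}_{x \sim X}\!\left[\lVert F(G(x)) - x \rVert_1\right] + \mathbb{E}_{y \sim Y}\!\left[\lVert G(F(y)) - y \rVert_1\right],
\]

where \(G\) maps undamaged images in domain \(X\) to damaged images in domain \(Y\), \(F\) maps damaged images back to the undamaged domain, \(D_X\) and \(D_Y\) are the corresponding discriminators, and the cycle-consistency term weighted by \(\lambda\) encourages translations that preserve the underlying building geometry.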
