Abstract

Manual visual inspection is the most common means of assessing the condition of civil infrastructure in the United States, but it can be exceedingly laborious, time-consuming, and dangerous. Research has focused on automating parts of the inspection process using unmanned aerial vehicles for image acquisition, followed by deep learning techniques for damage identification. Existing deep learning methods and datasets for inspections have typically been developed for a single damage type. However, most inspection guidelines require the identification of multiple damage types and describe evaluating the significance of the damage based on the associated material type. Thus, identifying the material type is important for understanding the meaning of the identified damage. Training separate networks for the tasks of material and damage identification fails to incorporate this intrinsic interdependence between them. We hypothesize that a network that incorporates such interdependence directly will achieve higher accuracy in material and damage identification. To this end, a deep neural network, termed the material-and-damage-network (MaDnet), is proposed to simultaneously identify material type (concrete, steel, asphalt), as well as fine (cracks, exposed rebar) and coarse (spalling, corrosion) structural damage. In this approach, semantic segmentation (i.e., assignment of each pixel in the image with a material and damage label) is employed, where the interdependence between material and damage is incorporated through shared filters learned through multi-objective optimization. A new dataset with pixel-level labels identifying the material and damage type is developed and made available to the research community. Finally, the dataset is used to evaluate MaDnet and demonstrate the improvement in pixel accuracy over employing independent networks.
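
The abstract describes a shared-encoder, multi-head semantic segmentation architecture trained with a multi-objective loss. The sketch below is not the authors' MaDnet implementation; it is a minimal illustration, in PyTorch, of the general idea of shared filters feeding separate material and damage segmentation heads. The layer sizes, class counts, and the scalar loss weight `w` are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumed, not the paper's code) of a multi-task
# segmentation network: one shared encoder, two per-pixel heads.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_MATERIALS = 4  # e.g., background, concrete, steel, asphalt (assumed)
NUM_DAMAGES = 5    # e.g., background, crack, exposed rebar, spalling, corrosion (assumed)

class SharedEncoderSegNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared filters: both tasks back-propagate into this encoder,
        # which is how the material/damage interdependence is captured.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Task-specific 1x1 convolutions producing per-pixel class scores.
        self.material_head = nn.Conv2d(64, NUM_MATERIALS, 1)
        self.damage_head = nn.Conv2d(64, NUM_DAMAGES, 1)

    def forward(self, x):
        feats = self.encoder(x)
        return self.material_head(feats), self.damage_head(feats)

def multi_task_loss(mat_logits, dmg_logits, mat_labels, dmg_labels, w=0.5):
    # Weighted sum of per-pixel cross-entropies; the fixed weight w is a
    # stand-in for whatever multi-objective scheme the paper actually uses.
    loss_mat = F.cross_entropy(mat_logits, mat_labels)
    loss_dmg = F.cross_entropy(dmg_logits, dmg_labels)
    return w * loss_mat + (1.0 - w) * loss_dmg

if __name__ == "__main__":
    model = SharedEncoderSegNet()
    images = torch.randn(2, 3, 128, 128)
    mat_gt = torch.randint(0, NUM_MATERIALS, (2, 128, 128))
    dmg_gt = torch.randint(0, NUM_DAMAGES, (2, 128, 128))
    mat_out, dmg_out = model(images)
    loss = multi_task_loss(mat_out, dmg_out, mat_gt, dmg_gt)
    loss.backward()
    print(float(loss))
```

Because the encoder is shared, gradients from both the material and damage losses update the same filters, which is one simple way to realize the interdependence the abstract argues for, in contrast to training two independent networks.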
