Abstract

Unmanned Aerial Systems (UAS) research and practice for bridge evaluations have attracted significant interest from government and private inspection agencies. Using UAS equipped with Infrared Thermography (IRT) cameras can help inspectors localize subsurface defects, such as delamination. Deep learning models can potentially generate robust defect maps; however, the lack of reliable annotated IRT datasets has hindered the development and application of deep learning to bridge deck evaluations. The authors present a novel and generic pipeline for the autonomous annotation of reinforced concrete bridge deck IRT images obtained using UAS. We used IRT images of five in-service concrete bridge decks before they were repaired. A map was generated for each deck to indicate the size and location of repaired delaminations using chain dragging and destructive testing, which is common practice for deck evaluation. We then devised a generic methodology to assign each pixel in the IRT image to its corresponding location on the delamination map. The authors hypothesized that a detector-free approach with deep feature-matching layers would be more accurate than conventional detector-based methods; therefore, we used a detector-free approach, Local Feature Transformer (LoFTR), to extract rich features from the IRT data, which resulted in high-quality stitched map generation. The stitched maps were matched with the delamination maps for each bridge deck to annotate each pixel in the stitched map based on the actual defect state and to trace each pixel in the stitched map back to its location in the original image. The proposed method yielded a 13x increase in annotated pixels compared to annotation of a stitched image produced with Agisoft, a commercial stitching software.
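The per-pixel annotation step described above can be illustrated with a minimal sketch. Assuming the registration between the stitched IRT map and the delamination map has already been estimated as a planar homography (e.g., from LoFTR correspondences), each IRT pixel can be projected into delamination-map coordinates and labeled with the defect state found there. The function name and the integer label convention below are hypothetical, not the paper's actual implementation:

```python
import numpy as np

def annotate_pixels(irt_shape, homography, delam_map):
    """Label every IRT pixel with the defect state at its mapped
    location on the delamination map.

    Hypothetical helper: assumes a 3x3 homography already relates IRT
    pixel coordinates (x, y) to delamination-map coordinates, and that
    delam_map holds integer defect labels (e.g., 1 = delaminated).
    """
    h, w = irt_shape
    # Homogeneous grid of IRT pixel coordinates (x, y, 1).
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    # Project into delamination-map coordinates and dehomogenize.
    mapped = homography @ pts
    mapped /= mapped[2]
    mx = np.round(mapped[0]).astype(int)
    my = np.round(mapped[1]).astype(int)
    # Pixels that land outside the delamination map get label -1 (unknown).
    labels = np.full(h * w, -1, dtype=int)
    inside = ((mx >= 0) & (mx < delam_map.shape[1]) &
              (my >= 0) & (my < delam_map.shape[0]))
    labels[inside] = delam_map[my[inside], mx[inside]]
    return labels.reshape(h, w)
```

With an identity homography the function simply copies the delamination labels; with any other registration it resamples them onto the IRT pixel grid, which is what lets every IRT pixel be traced back to a ground-truth defect state.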
