Abstract

Detecting and measuring the damage on historic glazed tiles plays an important role in the maintenance and protection of historic buildings. However, the current visual inspection method for identifying and assessing superficial damage on historic buildings is time- and labor-intensive. In this article, a novel two-level object detection, segmentation, and measurement strategy for large-scale structures based on a deep-learning technique is proposed. The data in this study are from the roof images of the Palace Museum in China. The first level of the model, which is based on the Faster region-based convolutional neural network (Faster R-CNN), automatically detects and crops two types of glazed-tile photographs from 100 roof images (2,488 × 3,264 pixels). The average precision (AP) values for roll roofing and pan tiles are 0.910 and 0.890, respectively. The cropped images are used to form a dataset for training a Mask R-CNN model. The second level of the model, which is based on Mask R-CNN, automatically segments and measures the damage in the cropped historic tile images; the AP for the damage segmentation is 0.975. The pixel-level damage segmentation predicted by Mask R-CNN is then used to quantitatively measure the morphological features of the damage, such as the damage topology, area, and ratio. To verify the performance of the proposed method, a comparative study was conducted with Mask R-CNN and a fully convolutional network. This is the first attempt at employing a two-level strategy to automatically detect, segment, and measure large-scale superficial damage on historic buildings based on deep learning, and it achieved good results.
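The abstract does not give implementation details, but the two-level pipeline it describes can be sketched with off-the-shelf components. The sketch below assumes PyTorch/torchvision implementations of Faster R-CNN and Mask R-CNN; the class counts, score and mask thresholds, and the crop_tiles/measure_damage helpers are illustrative assumptions, not the authors' code. The damage ratio is computed simply as damage pixels divided by tile pixels, mirroring the "area and ratio" measures mentioned in the abstract.

```python
# Hypothetical sketch of the two-level strategy described in the abstract,
# using torchvision's generic Faster R-CNN and Mask R-CNN models.
# Class indices, thresholds, and helper names are illustrative only.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn, maskrcnn_resnet50_fpn

# Level 1: detect tile regions (background + roll roofing + pan tile -> 3 classes).
detector = fasterrcnn_resnet50_fpn(num_classes=3)
detector.eval()

# Level 2: segment damage inside each cropped tile (background + damage -> 2 classes).
segmenter = maskrcnn_resnet50_fpn(num_classes=2)
segmenter.eval()

def crop_tiles(roof_image, score_threshold=0.5):
    """Run the first-level detector and return cropped tile images."""
    with torch.no_grad():
        detections = detector([roof_image])[0]
    crops = []
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < score_threshold:
            continue
        x1, y1, x2, y2 = box.int().tolist()
        if x2 <= x1 or y2 <= y1:          # skip degenerate boxes
            continue
        crops.append(roof_image[:, y1:y2, x1:x2])
    return crops

def measure_damage(tile_image, mask_threshold=0.5):
    """Run the second-level segmenter and compute damage area (pixels) and ratio."""
    with torch.no_grad():
        prediction = segmenter([tile_image])[0]
    _, h, w = tile_image.shape
    damage_pixels = 0
    for mask in prediction["masks"]:       # each mask: (1, H, W) soft mask
        damage_pixels += int((mask[0] > mask_threshold).sum())
    damage_ratio = damage_pixels / float(h * w)
    return damage_pixels, damage_ratio

# Example usage on a dummy roof image sized like those in the study (3 x 2488 x 3264):
roof = torch.rand(3, 2488, 3264)
for tile in crop_tiles(roof):
    area_px, ratio = measure_damage(tile)
    print(f"damage area: {area_px} px, damage ratio: {ratio:.3%}")
```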


