Abstract
The automatic identification of construction projects, which can be regarded as complex scenes, is a technical challenge for the supervision of soil and water conservation in urban areas. Construction projects in high-resolution remote sensing images lack a unified semantic definition and therefore exhibit large differences in image features. This paper proposes an identification method for construction projects based on the detection of the detailed ground objects that compose them, including movable slab houses, buildings under construction, dust screens, and bare soil (rock). To create the training data set, we select highly informative detailed ground objects from high-resolution remote sensing images. The Faster R-CNN (region-based convolutional neural network) algorithm is then used to detect construction projects and the highly informative detailed ground objects separately. The merging of detection boxes and the correction of detailed-ground-object combinations jointly improve the confidence of the construction project detection results. Experiments show that the accuracy indicators of this method on a data set of construction projects in Wuhan outperform those of comparative methods, with an AP value of 0.773 and an F1 score of 0.417. The proposed method achieves satisfactory identification results for construction projects in complex scenes and can be applied to the comprehensive supervision of soil and water conservation in construction projects.
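The detection-box merging step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the IoU threshold and the confidence-update rule are assumptions introduced here to show how overlap with detected detailed ground objects could raise the confidence of a candidate construction-project box.

```python
# Hypothetical sketch of merging project-level detections with
# detailed-ground-object detections (slab houses, dust screens, bare soil).
# The threshold and the evidence-combination rule are assumptions.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def merge_detections(project_boxes, object_boxes, iou_thresh=0.5):
    """Raise the confidence of each candidate project box when it overlaps
    detected detailed ground objects. Each input item is (box, score)."""
    merged = []
    for box, score in project_boxes:
        support = [s for b, s in object_boxes if iou(box, b) >= iou_thresh]
        if support:
            # Assumed update rule: combine evidence multiplicatively,
            # so supporting objects can only increase the confidence.
            score = 1.0 - (1.0 - score) * (1.0 - max(support))
        merged.append((box, score))
    return merged
```

For example, a project box with score 0.6 that overlaps a dust-screen detection with score 0.5 (IoU above the threshold) would be promoted to 0.8 under this assumed rule.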