Abstract

Three‐dimensional (3D) object detection, that is, localizing and classifying all critical objects in a 3D space, is essential for downstream construction scene analysis tasks. However, the difficulty of accurate instance segmentation, the scarcity of 2D object segmentation and 3D object detection data sets, the need for high‐quality feature representations for depth estimation, and the limited 3D cues available from a single red‐green‐blue (RGB) image pose significant challenges to 3D object detection and severely hinder its practical application. In response to these challenges, an improved cascade‐based network with a transformer backbone and a boundary‐patch‐refinement method is proposed to build hierarchical features and refine object boundaries, yielding improved 2D object detection and instance segmentation. Furthermore, a novel self‐supervised monocular depth learning method is proposed to extract better feature representations for depth estimation from construction site video data with unknown camera parameters. Additionally, a pseudo‐LiDAR point cloud method and a 3D object detection method with a density‐based clustering algorithm are proposed to detect 3D objects in a construction scene without help from 3D labels, providing a foundation for other downstream 3D tasks. Finally, the proposed model is evaluated for object instance segmentation and depth estimation on the moving objects in construction sites (MOCS) and construction scene data sets. It brings a 9.16% gain in mean average precision (mAP) for object detection and a 4.92% gain in mask mAP for object instance segmentation. The average order accuracy and relative mean error for depth estimation are improved by 0.94% and 60.56%, respectively. This study aims to overcome the challenges and limitations of 3D object detection and facilitate practical applications in construction scene analysis.
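The pseudo‐LiDAR step described above back‐projects a per‐pixel depth map into a 3D point cloud, after which a density‐based clustering algorithm groups points into object candidates. The following is a minimal sketch of that idea, assuming a pinhole camera model with known intrinsics (fx, fy, cx, cy) and a simple DBSCAN‐style clusterer; the paper's actual pipeline estimates depth self‐supervised with unknown camera parameters, and its clustering settings are not specified here.

```python
import numpy as np

def depth_to_pseudo_lidar(depth, fx, fy, cx, cy):
    """Back-project a depth map (H, W) into an (N, 3) pseudo-LiDAR
    point cloud, assuming a pinhole camera with intrinsics fx, fy, cx, cy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # pixel column -> camera x
    y = (v - cy) * z / fy          # pixel row    -> camera y
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def dbscan(points, eps=0.5, min_pts=4):
    """Minimal density-based clustering (DBSCAN-style) for illustration.
    Returns one label per point, -1 for noise. O(N^2) pairwise distances,
    so only suitable for small point sets."""
    n = len(points)
    labels = np.full(n, -1)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        # only an unvisited core point (dense neighborhood) seeds a cluster
        if visited[i] or len(neighbors[i]) < min_pts:
            continue
        queue = [i]
        visited[i] = True
        while queue:
            j = queue.pop()
            labels[j] = cluster
            if len(neighbors[j]) >= min_pts:   # expand only through core points
                for k in neighbors[j]:
                    if not visited[k]:
                        visited[k] = True
                        queue.append(k)
        cluster += 1
    return labels
```

Each cluster of pseudo‐LiDAR points can then be wrapped in a 3D bounding box (e.g., by taking per‐axis minima and maxima), which is how label‐free 3D detection proposals can be formed from monocular depth alone.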
