Abstract
Scan-to-BIM technology helps construction stakeholders obtain accurate BIM models for building information management and construction progress monitoring. Current scan-to-BIM practice relies on laser scanning to capture a building's as-built condition, followed by processing the scanned point clouds to generate the BIM, a laborious and time-consuming workflow. This study presents a new robot-assisted mobile laser scanning approach that uses a legged robot and a solid-state LiDAR sensor for automated 3D reconstruction and point cloud semantic segmentation. The proposed approach integrates a Simultaneous Localisation and Mapping (SLAM) algorithm with robot motion control and pathfinding for 3D mapping, so as to obtain complete point cloud data. To ensure the completeness and density of the scanned point clouds, new scanning fitness metrics and a grid-based direction-by-direction algorithm are designed for robot pathfinding. An enhanced dynamic window approach is then established to integrate the motion control of the legged robot with SLAM for reconstructing dense point clouds. Finally, semantic segmentation is performed with ResPointNet++ to segment architectural and structural components. Experimental results show that the mobile scanning approach achieves high point cloud completeness and density, with an overall mIoU of 81.75% for semantic segmentation. An ablation test demonstrates that the proposed approach outperforms a variant that does not integrate robot motion with SLAM. The proposed approach supports the efficient collection of high-quality point clouds of building interiors, with the aim of improving building information management.