Abstract

The construction of large-scale civil infrastructure requires massive spatiotemporal data to support the management and control of scheduling, quality control, and safety monitoring. Existing artificial-intelligence-based data processing algorithms rely heavily on experienced engineers to tune processing parameters, which is inefficient and time-consuming for large datasets, and few studies have compared the performance of different algorithms on a unified dataset. This study proposes a framework and evaluation system for comparing data processing policies for massive spatiotemporal data in construction quality control. The proposed method compares combinations of the multiple types of algorithms involved in processing massive point cloud data. The performance of each data processing strategy is evaluated through the framework, and the optimal point cloud processing strategies are identified based on registration accuracy and data fidelity. Results show that a well-chosen combination of point cloud sampling, filtering, and registration algorithms can significantly improve the efficiency of point cloud data processing while satisfying engineering demands for data accuracy and completeness. The proposed method can be applied to the civil engineering problem of processing large volumes of point cloud data and selecting the optimal processing method.
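The pipeline stages the abstract evaluates (sampling, then filtering, then registration, scored by registration accuracy) can be illustrated with a minimal NumPy-only sketch. This is a toy, not the paper's implementation: the voxel size, neighbour count, and outlier threshold below are arbitrary assumptions, and registration is shown with the Kabsch algorithm on known point correspondences, whereas production pipelines would typically apply ICP variants to real scans.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Sampling: average the points that fall into each occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    counts = np.bincount(inv).astype(float)
    return np.column_stack([np.bincount(inv, weights=points[:, d]) / counts
                            for d in range(3)])

def statistical_filter(points, k=8, std_ratio=2.0):
    """Filtering: drop points whose mean distance to their k nearest
    neighbours exceeds the global mean by std_ratio standard deviations.
    Brute-force O(n^2) here; real pipelines would use a KD-tree."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance 0
    thresh = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= thresh]

def kabsch_align(src, dst):
    """Registration: best-fit rigid transform (R, t) mapping src onto dst,
    assuming known correspondences (Kabsch algorithm)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, dc - R @ sc

# Toy cloud: a planar grid plus one gross outlier.
xs = np.linspace(0.0, 1.0, 20)
grid = np.array([[x, y, 0.0] for x in xs for y in xs])
cloud = np.vstack([grid, [[5.0, 5.0, 5.0]]])

down = voxel_downsample(cloud, voxel_size=0.25)   # sampling
clean = statistical_filter(down)                  # filtering removes outlier

# Registration accuracy: recover a known rigid motion, report RMSE.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
target = clean @ R_true.T + np.array([0.1, -0.2, 0.05])
R_est, t_est = kabsch_align(clean, target)
rmse = np.sqrt(((clean @ R_est.T + t_est - target) ** 2).sum(axis=1).mean())
```

Swapping in different sampling densities, filter thresholds, or registration solvers at each stage, and comparing the resulting accuracy and retained-point counts, is the kind of strategy comparison the framework performs at scale.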
