Abstract

Automatic registration of point clouds captured by terrestrial laser scanning (TLS) plays an important role in many fields, including remote sensing (e.g., transportation management, 3-D reconstruction of large-scale urban areas, and environmental monitoring), computer vision, virtual reality, and robotics. However, noise, outliers, nonuniform point density, and small overlaps are inevitable when collecting multiple views of data, which poses great challenges to 3-D point cloud registration. Because conventional registration methods aim to find point correspondences and estimate transformation parameters directly in the original point space, the traditional way to address these difficulties is to impose many restrictions during the scanning process (e.g., additional scans and careful selection of scanning positions), making data acquisition more difficult. In this article, we present a novel 3-D registration framework that operates in a "middle-level structural space" and is capable of robustly and efficiently reconstructing urban, semiurban, and indoor scenes despite disturbances introduced during scanning. The structural space is constructed by extracting multiple types of middle-level geometric primitives (planes, spheres, cylinders, and cones) from the 3-D point cloud. We design a robust method to find effective primitive combinations corresponding to the 6-D poses of the raw point clouds and then construct hybrid-structure-based descriptors. By matching these descriptors and computing rotation and translation parameters, successful registration is achieved. Note that the entire process is performed in the structural space, which has the advantage of capturing both geometric structure (the relationships between primitives) and semantic features (primitive types and parameters) over larger fields of view.
Experiments show that our method achieves state-of-the-art performance on several point cloud registration benchmark datasets at different scales and even obtains good registration results for data without overlapping areas.
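The final step described above, computing rotation and translation parameters from matched correspondences, can be illustrated with the standard SVD-based (Kabsch) least-squares solution. This is a minimal sketch, not the paper's full pipeline: the primitive extraction and hybrid-structure descriptor matching are assumed to have already produced corresponding points (e.g., primitive centers), and the function name `estimate_rigid_transform` is our own.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate rotation R and translation t such that R @ p + t maps each
    source point p to its matched destination point, in the least-squares
    sense, via the standard SVD-based (Kabsch) solution."""
    src_c = src.mean(axis=0)          # centroids of both point sets
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: apply a known rigid motion to a few points and recover it.
rng = np.random.default_rng(0)
pts = rng.random((6, 3))
theta = np.pi / 5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.3, -1.2, 2.0])
moved = pts @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(pts, moved)
```

The reflection guard matters in practice: with noisy or degenerate (e.g., coplanar) correspondences, the unconstrained SVD solution can return an improper rotation, and flipping the sign of the last singular vector restores a valid rotation matrix.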
