Point cloud registration, the alignment of multiple point clouds to a unified reference frame, is an essential and challenging task in photogrammetry and computer vision. In this paper, we propose a novel global registration method based on robust phase correlation for low-overlapping point clouds, which is less sensitive to noise and outliers than feature-based registration methods. The proposed method converts the estimation of rotation, scaling, and translation in the spatial domain into the correlation of low-frequency components in the frequency domain. Specifically, it consists of three core steps: transformation from the spatial domain to the frequency domain; decoupling of rotation, scaling, and translation; and adapted phase correlation for robust shift estimation. In the first step, unstructured and unordered 3D points are voxelized, binarized, and then transformed from the spatial domain to the frequency domain via the 3D Fourier transform. In the second step, rotation, scaling, and translation are decoupled by sequential operations, including the Fourier transform, resampling strategies, and the Fourier-Mellin transform. In the third step, the estimation of each transformation parameter is reduced to a shift estimation task. This task is solved by a robust phase correlation method, in which low-frequency components are matched by decomposing the normalized cross-power spectrum and linearly fitting the decomposed signals with a closed-form solution using an ℓ1-norm-based robust estimator. Experiments were conducted on three datasets covering urban and natural scenarios. Results demonstrate the effectiveness of the proposed method, with the majority of rotation and translation errors below 0.2 degrees and 0.5 m, respectively.
Additionally, experiments validate that the proposed method is robust to noise and generalizes well to datasets with a wide range of overlaps and varied geometric characteristics.
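To make the underlying principle concrete, the sketch below demonstrates classic phase correlation for shift estimation with NumPy: a shift in the spatial domain becomes a linear phase ramp in the frequency domain, and the peak of the inverse transform of the normalized cross-power spectrum recovers the shift. This is the standard building block the abstract refers to; the paper's specific contributions (spectrum decomposition, low-frequency matching, and ℓ1-norm robust fitting) are not reproduced here.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the circular shift d such that a[x] ≈ b[x - d]."""
    F = np.fft.fft2(a)
    G = np.fft.fft2(b)
    cross = F * np.conj(G)
    # Normalized cross-power spectrum: magnitudes cancel, only phase remains.
    r = cross / np.maximum(np.abs(cross), 1e-12)
    corr = np.fft.ifft2(r).real
    # Ideally corr is a delta function; its peak location is the shift.
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(int(v) for v in peak)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(5, 11), axis=(0, 1))
print(phase_correlation_shift(shifted, img))  # → (5, 11)
```

In the paper's pipeline this shift estimation is applied after rotation and scaling have been decoupled (via the Fourier-Mellin transform), so that each transformation parameter reduces to a shift of this kind.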