Indoor mapping methods based on three-dimensional (3D) LiDAR can be costly and lack environmental color information, which limits their application scenarios. This study presents a low-cost, omnidirectional 3D color LiDAR mapping system for indoor environments. The system consists of two two-dimensional (2D) LiDARs, six monocular cameras, and a servo motor. Point clouds are fused with imagery using a pixel-spatial dual-constrained depth gradient adaptive regularization (PS-DGAR) algorithm to produce dense 3D color point clouds. During fusion, the point cloud is inversely reconstructed from predicted per-pixel depth values, compensating for regions with sparse spatial features. For indoor scene reconstruction, a globally consistent alignment algorithm based on a particle filter and iterative closest point (PF-ICP) is proposed, which combines adjacent-frame registration with global pose optimization to reduce mapping error. Experimental results demonstrate that the proposed density enhancement method achieves an average error of 1.5 cm, significantly improving the density and geometric integrity of sparse point clouds. The registration algorithm achieves a root mean square error (RMSE) of 0.0217 and a runtime of less than 4 s, outperforming traditional iterative closest point (ICP) variants. Furthermore, the proposed low-cost omnidirectional 3D color LiDAR mapping system demonstrates superior measurement accuracy in indoor environments.
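To make the registration pipeline concrete, the following is a minimal sketch of the adjacent-frame registration step, written against Open3D's standard point-to-point ICP. The particle-filter pose initialization and the global pose optimization that define PF-ICP are the paper's contributions and are not reproduced here; this only illustrates the pairwise ICP baseline the method builds on. The function name, voxel size, and correspondence threshold are illustrative assumptions, not values from the paper.

```python
# Sketch of pairwise (adjacent-frame) ICP registration with Open3D.
# PF-ICP would supply a particle-filter pose estimate as `init` and then
# refine the chained poses with a global optimization step (not shown).
import numpy as np
import open3d as o3d

def register_adjacent_frames(source, target, voxel_size=0.05, init=np.eye(4)):
    """Align two consecutive LiDAR frames; returns (4x4 transform, inlier RMSE)."""
    # Downsample to stabilize correspondences and speed up iteration.
    src = source.voxel_down_sample(voxel_size)
    tgt = target.voxel_down_sample(voxel_size)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt,
        max_correspondence_distance=voxel_size * 2.0,  # assumed threshold
        init=init,  # identity here; PF-ICP would pass a filtered pose estimate
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
        criteria=o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=50),
    )
    return result.transformation, result.inlier_rmse
```

Chaining the returned per-pair transforms yields an initial global pose per frame; a global pose optimization, as the paper proposes, would then correct the accumulated drift.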