Abstract
The RGB-D camera can simultaneously acquire color and depth information of the target surface and has been widely used in 3D modeling, machine vision, and other related fields. Traditional point cloud registration algorithms consider only geometric information; their operating efficiency is low and they require good initial values. This paper presents a new approach to align point clouds from different frames acquired by an RGB-D camera, which considers visual texture and geometric information simultaneously. First, feature points are detected and matched on the RGB images, and the RANSAC algorithm is used to eliminate wrong matches. Then, the 2D matching pairs are converted to 3D feature point clouds based on the depth camera model, and point pairs without depth data are discarded. Finally, the camera pose parameters are calculated by performing the iterative closest point (ICP) algorithm on the feature point clouds, and the calculated pose parameters are applied to the whole frame. The experimental results show that: (1) among the SIFT, SURF, and ORB feature point extraction operators, ORB performs best for point cloud registration; (2) the proposed algorithm has high registration accuracy, with rotation and translation estimation errors of less than 0.0097 and 4.2 mm, respectively; (3) the algorithm also significantly improves convergence speed, requiring only 0.138 seconds, which meets real-time processing requirements; (4) the algorithm is insensitive to initial values and has strong robustness.
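To make the described pipeline concrete, the sketch below walks through the same steps in Python with OpenCV and NumPy. The intrinsic parameters (fx, fy, cx, cy), the depth scale, the ORB/RANSAC parameter choices, and the closed-form SVD alignment standing in for the ICP iteration are illustrative assumptions; the paper does not provide an implementation.

```python
# Illustrative sketch of the feature-based RGB-D registration pipeline.
# Intrinsics, depth scale, and helper names are assumptions, not the authors' code.
import cv2
import numpy as np

fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5   # assumed pinhole intrinsics
depth_scale = 1000.0                           # assumed: depth stored in millimetres

def match_orb(rgb_a, rgb_b):
    """Step 1: detect and match ORB features on the RGB images,
    then reject wrong matches with RANSAC (here via a homography fit)."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, des_a = orb.detectAndCompute(rgb_a, None)
    kp_b, des_b = orb.detectAndCompute(rgb_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
    _, mask = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 3.0)
    inliers = mask.ravel().astype(bool)
    return pts_a[inliers], pts_b[inliers]

def back_project(pts_2d, depth):
    """Step 2: lift 2D feature points to 3D with the pinhole depth-camera model;
    pixels without a depth reading are flagged so the pair can be discarded."""
    u = pts_2d[:, 0].astype(int)
    v = pts_2d[:, 1].astype(int)
    z = depth[v, u] / depth_scale
    valid = z > 0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=1), valid

def rigid_transform(src, dst):
    """Step 3: closed-form least-squares alignment (Kabsch/SVD) of the matched
    feature point clouds; this stands in for the ICP step in the paper, which
    converges quickly because the correspondences are already known."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

def register_frames(rgb_a, depth_a, rgb_b, depth_b):
    """Full pipeline: pose (R, t) mapping frame A into frame B,
    to be applied afterwards to the whole frame data."""
    pts_a, pts_b = match_orb(rgb_a, rgb_b)
    xyz_a, ok_a = back_project(pts_a, depth_a)
    xyz_b, ok_b = back_project(pts_b, depth_b)
    ok = ok_a & ok_b                  # drop point pairs lacking depth data
    return rigid_transform(xyz_a[ok], xyz_b[ok])
```

Because only a few hundred matched feature points are aligned rather than the full frames, the pose estimate is cheap to compute, which is consistent with the sub-second runtimes reported in the abstract.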