Abstract

This paper proposes a method of registering point clouds using 2D images, 3D point clouds, and their correspondences in order to provide appropriate initial conditions for 3D fine registration algorithms such as Iterative Closest Point. Many commercially available optical 3D scanners capture both 3D point clouds and 2D images, and the correspondences between them can be obtained from camera calibration information. The proposed method registers 3D source data (moving) to 3D reference data (fixed) iteratively, with each iteration consisting of three steps: (1) finding image correspondences between the source and reference images; (2) transforming the source data using the corresponding 3D points; and (3) generating a virtual image of the source data in the transformed coordinates. These steps are repeated until the source data reaches a pose suitable as an initial condition for fine registration. The proposed method has been tested on various objects, including mechanical parts, animals, and cultural items.
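Step (2) of each iteration amounts to estimating a rigid transformation from paired 3D points. As a minimal illustrative sketch (not the authors' implementation), the standard Kabsch/SVD method recovers the rotation and translation that best align corresponding points in the least-squares sense; all function names below are hypothetical:

```python
import numpy as np

def rigid_transform_3d(src, ref):
    """Estimate rotation R and translation t minimizing ||R @ src_i + t - ref_i||
    over paired 3D points (Kabsch method). src, ref: (N, 3) arrays."""
    src_c = src.mean(axis=0)
    ref_c = ref.mean(axis=0)
    H = (src - src_c).T @ (ref - ref_c)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ref_c - R @ src_c
    return R, t

def apply_transform(points, R, t):
    """Apply the rigid transform to an (N, 3) array of points."""
    return points @ R.T + t

# Demo: recover a known rotation (30 degrees about z) and translation.
rng = np.random.default_rng(0)
src = rng.random((50, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
ref = src @ R_true.T + t_true
R, t = rigid_transform_3d(src, ref)
```

In the full pipeline, the point pairs fed to such an estimator would come from the 2D image correspondences of step (1), lifted to 3D via the scanner's calibration, and the result would seed a fine-registration algorithm such as Iterative Closest Point.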
