Abstract

A 3D reconstruction method based on feature points is presented, and the parameters used to improve the reconstruction are discussed. The precision of the 3D reconstruction is improved by combining point clouds obtained with structured light from different viewpoints. A well-known algorithm for point cloud registration is ICP (Iterative Closest Point), which determines the rotation and translation that, when applied to one of the point clouds, optimally aligns the two. The ICP algorithm iteratively executes two main steps: point correspondence determination and transform estimation. If the point correspondences are poorly determined, ICP can converge to a local minimum. To overcome this drawback, two techniques were used: a meaningful set of 3D points was obtained with SIFT (Scale-Invariant Feature Transform), and an ICP variant was implemented that uses statistics to set dynamic distance and color thresholds on the separation allowed between closest points. The reconstruction precision is improved by using these meaningful point clouds together with ICP to increase the number of points in 3D space. The surface is then reconstructed using marching cubes, with filters applied to remove noise and smooth the surface. The factors that influence the precision of the 3D reconstruction are discussed and analyzed, including a detailed discussion of the number of frames used by ICP and of the ICP parameters.
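
The sketch below illustrates the general idea of an ICP iteration with a statistics-based dynamic distance threshold for rejecting unreliable closest-point correspondences. It is a minimal illustration under assumed choices (the function names, the `k_sigma` factor, and the SVD-based transform estimation are not taken from the paper, and the color threshold mentioned in the abstract is omitted), not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (SVD/Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def dynamic_threshold_icp(source, target, iters=30, k_sigma=2.0):
    """Align source (Nx3) to target (Mx3), rejecting pairs farther than mean + k*std."""
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        dists, idx = tree.query(src)                    # closest-point correspondences
        thresh = dists.mean() + k_sigma * dists.std()   # dynamic distance threshold
        keep = dists < thresh                           # discard likely-wrong matches
        R, t = best_rigid_transform(src[keep], target[idx[keep]])
        src = src @ R.T + t                             # apply incremental transform
    return src
```

In this sketch the rejection threshold is recomputed every iteration from the current distribution of closest-point distances, so it tightens automatically as the clouds come into alignment.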
