Abstract
With the development of computer vision technology, 3D reconstruction has become a research hotspot. Current 3D reconstruction methods rely heavily on expensive equipment and offer poor real-time performance. In this paper, we address the problem of reconstructing indoor scenes with a large vertical span and propose a novel 3D reconstruction approach that requires only a Kinect. First, a Kinect sensor captures color and depth images of the indoor scene. Second, a combination of the scale-invariant feature transform (SIFT) and the random sample consensus (RANSAC) algorithm determines the transformation matrix between adjacent frames, which serves as the initial value for iterative closest point (ICP). Third, ICP establishes the relative coordinate relation between pairwise frames, yielding the initial point cloud data. Finally, top-down registration of the point cloud data produces the 3D visual reconstruction model of the indoor scene. This approach not only mitigates the sensor's viewing-angle restriction, enabling reconstruction of indoor scenes with a large vertical span, but also provides a fast reconstruction algorithm for the large volumes of point cloud data involved. Experimental results show that the proposed algorithm achieves higher accuracy, better reconstruction quality, and shorter running time for point cloud registration. The proposed method also has great potential for 3D simultaneous localization and mapping (SLAM).
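As a rough illustration of the coarse-to-fine registration pipeline summarized above (a minimal sketch, not the authors' implementation), the following Python code matches SIFT features between two color frames, lifts the matches to 3D with the depth images, estimates an initial rigid transform by RANSAC with a Kabsch fit, and refines it with point-to-point ICP. It assumes OpenCV (for SIFT) and Open3D (for ICP); the function register_pair, the intrinsics parameters, and the RANSAC thresholds are hypothetical names and values chosen for illustration.

```python
# Illustrative sketch (not the authors' code): coarse alignment from
# SIFT + RANSAC, refined by point-to-point ICP. Assumes OpenCV and Open3D.
import cv2
import numpy as np
import open3d as o3d

def backproject(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth z into 3D camera coordinates."""
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cQ - R @ cP

def ransac_rigid(P, Q, iters=500, thresh=0.02):
    """RANSAC over minimal 3-point samples; keeps the transform with most inliers."""
    best = (np.eye(3), np.zeros(3), -1)
    rng = np.random.default_rng(0)
    for _ in range(iters):
        idx = rng.choice(len(P), 3, replace=False)
        R, t = kabsch(P[idx], Q[idx])
        inliers = int(np.sum(np.linalg.norm(P @ R.T + t - Q, axis=1) < thresh))
        if inliers > best[2]:
            best = (R, t, inliers)
    return best[0], best[1]

def register_pair(gray1, depth1, gray2, depth2, fx, fy, cx, cy):
    """Estimate the rigid transform between two Kinect frames (grayscale + depth in meters)."""
    # 1. SIFT keypoints on both color frames, matched with Lowe's ratio test.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(gray1, None)
    kp2, des2 = sift.detectAndCompute(gray2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    # 2. Lift matched pixels to 3D using the depth images.
    P, Q = [], []
    for m in good:
        (u1, v1), (u2, v2) = kp1[m.queryIdx].pt, kp2[m.trainIdx].pt
        z1, z2 = depth1[int(v1), int(u1)], depth2[int(v2), int(u2)]
        if z1 > 0 and z2 > 0:  # skip pixels with missing depth
            P.append(backproject(u1, v1, z1, fx, fy, cx, cy))
            Q.append(backproject(u2, v2, z2, fx, fy, cx, cy))
    P, Q = np.array(P), np.array(Q)

    # 3. RANSAC + Kabsch gives the coarse transform used to initialize ICP.
    R, t = ransac_rigid(P, Q)
    init = np.eye(4)
    init[:3, :3], init[:3, 3] = R, t

    # 4. Point-to-point ICP refinement (run on the sparse feature points here;
    #    in practice the full depth-image point clouds would be registered).
    src, tgt = o3d.geometry.PointCloud(), o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(P)
    tgt.points = o3d.utility.Vector3dVector(Q)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, 0.02, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 matrix aligning frame 1 to frame 2
```

The two-stage design reflects the motivation stated in the abstract: ICP converges only locally, so the SIFT + RANSAC estimate supplies an initial transform close enough to the true alignment for ICP to refine it reliably and quickly.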