Abstract
Three-dimensional maps are useful in many applications, from the gaming industry to augmented reality to the development of tour guides for important landmarks such as museums or university campuses. Generating such maps by hand is very labour intensive, which has motivated their automated construction by robots equipped with range sensors such as lasers or cameras. This paper presents an automated 3D reconstruction system for indoor environments that combines vision-based occupancy-grid SLAM (Simultaneous Localization and Mapping) with detection of the ground. The novelty of our work lies in the way 3D information is extracted and fed to SLAM. First, coherent sections of the scene are segmented using a graph-cut algorithm; next, 3D points extracted from a stereo camera are used to fit a plane to each section. The ground plane is then identified from the orientation of its normal, and virtual rays are cast across the field of view from the camera center to the point where each ray's 2D projection intersects the ground boundaries. Dense depth information can then be inferred from these rays and fed to SLAM. Walls and the ceiling are also reconstructed heuristically by enforcing normality constraints and staying within the ground boundaries. Our system produces high-quality maps and reduces the high computational cost of dense stereo matching by processing only a sparse set of highly reliable salient features. Experiments conducted in a lab setting demonstrate the effectiveness of the system.
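The abstract's plane-fitting and ground-detection step can be illustrated with a minimal sketch, not taken from the paper: it assumes the 3D points of one segmented region are available from stereo triangulation and that the camera's y axis is roughly vertical; the function names fit_plane and is_ground, and the angle threshold, are illustrative assumptions only.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points in the
    camera frame. Returns (unit normal, centroid)."""
    centroid = points.mean(axis=0)
    # SVD of the centred points: the right singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal / np.linalg.norm(normal), centroid

def is_ground(normal, up=np.array([0.0, 1.0, 0.0]), max_angle_deg=15.0):
    """Label a segment as ground if its plane normal is within
    max_angle_deg of the assumed vertical axis (sign-insensitive)."""
    cos_angle = abs(np.dot(normal, up))
    return cos_angle >= np.cos(np.radians(max_angle_deg))

# Example: a noisy, roughly horizontal patch about 1.5 m below the camera.
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1.0, 1.0, 200),        # x (lateral)
    1.5 + rng.normal(0.0, 0.01, 200),   # y (assumed vertical axis)
    rng.uniform(2.0, 5.0, 200),         # z (depth)
])
normal, centroid = fit_plane(pts)
print("normal:", normal, "ground?", is_ground(normal))
```

In the system described above, a segment labelled as ground in this way would supply the boundary against which the virtual rays are cast to suggest dense depth for SLAM.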