Abstract
A fundamental problem in autonomous vehicle navigation is the identification of obstacle-free space in cluttered and unstructured environments. Features such as walls, people, furniture, doors, and stairs are potential hazards. The approach taken in this paper is motivated by recent developments in infra-red time-of-flight cameras that provide low-resolution depth maps at video frame rates. We propose to exploit the temporal information provided by the high refresh rate of such cameras to overcome the limitations of low spatial resolution and high depth uncertainty, and to provide robust and accurate estimates of planar surfaces in the environment. These surface estimates are then used in statistical tests to identify obstacles and hazards in the environment. Classical 3D spatial RANSAC is extended to 4D spatio-temporal RANSAC by developing spatio-temporal models of planar surfaces that incorporate a linear motion model as well as linear environment features. A 4D vector product is used for hypothesis generation from data that is randomly sampled across both the spatial and temporal dimensions. The algorithm is fully posed in the spatio-temporal representation, and there is no need to correlate points or hypotheses between temporal images. The proposed algorithm is computationally fast and robust for the estimation of planar surfaces in general and the ground plane in particular. There are potential applications in mobile robotics, autonomous vehicle navigation, and automotive safety systems. The claims of the paper are supported by experimental results obtained from real video data from a time-of-flight range sensor mounted on an automobile navigating an undercover parking lot.
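To illustrate the general idea described above, the sketch below shows one possible form of 4D spatio-temporal RANSAC plane fitting. It is not the authors' implementation; the function names, parameters, and tolerances are illustrative assumptions. Points are pooled over several depth frames as (x, y, z, t) samples, four samples are drawn across space and time, and the 4D vector product of their difference vectors yields a hyperplane normal, which corresponds to a 3D plane whose offset varies linearly with time.

```python
# Minimal sketch (assumed, not the authors' code) of 4D spatio-temporal RANSAC.
import numpy as np

def cross4(a, b, c):
    """Generalised 4D vector product: returns a vector orthogonal to a, b, c."""
    M = np.vstack([a, b, c])  # 3x4 matrix of the three spanning vectors
    # Component i is the signed 3x3 cofactor obtained by deleting column i.
    return np.array([(-1) ** i * np.linalg.det(np.delete(M, i, axis=1))
                     for i in range(4)])

def ransac_4d_plane(points, iters=200, tol=0.05, rng=None):
    """points: (N, 4) array of (x, y, z, t) samples pooled over frames.

    Fits a spatio-temporal hyperplane n . (x, y, z, t) + c = 0, i.e. a
    3D plane whose offset drifts linearly with time (linear motion model).
    """
    rng = np.random.default_rng(rng)
    best_model, best_inliers = None, 0
    for _ in range(iters):
        # Sample 4 points across both space and time; three difference
        # vectors span the hyperplane, their 4D product gives its normal.
        p = points[rng.choice(len(points), 4, replace=False)]
        n = cross4(p[1] - p[0], p[2] - p[0], p[3] - p[0])
        spatial_norm = np.linalg.norm(n[:3])
        if spatial_norm < 1e-9:
            continue  # skip degenerate or purely temporal samples
        c = -n @ p[0]
        # Spatial distance of every sample to the plane at its own time.
        residuals = np.abs(points @ n + c) / spatial_norm
        inliers = np.count_nonzero(residuals < tol)
        if inliers > best_inliers:
            best_model, best_inliers = (n, c), inliers
    return best_model, best_inliers
```

Because every hypothesis is generated and scored directly in the pooled (x, y, z, t) representation, no point or hypothesis correspondence between individual frames is required, which is the property the abstract highlights.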