Abstract
In this paper, we present a control architecture for an intelligent outdoor mobile robot. The architecture enables the robot to navigate a complex, natural outdoor environment, relying only on a single on-board camera as sensory input. This is achieved through a twofold analysis of the visual data stream: a dense structure from motion (SfM) algorithm computes a depth map of the environment, and a visual simultaneous localization and mapping (SLAM) algorithm builds a map of the surroundings from image features. This information enables a behavior-based motion and path planner to navigate the robot through the environment. In this paper, we describe the theoretical aspects of setting up this architecture.
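As a rough illustration of how these three components could be wired together, the following Python sketch feeds a single camera stream into a dense depth estimate, a feature-based SLAM update, and a behavior-based planner. All function names, signatures, and return values are hypothetical placeholders chosen for this sketch; they do not reproduce the algorithms described in the paper.

    import numpy as np

    def dense_depth_from_motion(prev_frame, frame):
        # Placeholder for the dense structure-from-motion step: in the real
        # system this would return a per-pixel depth map computed from the
        # apparent motion between consecutive frames.
        return np.full(frame.shape[:2], np.inf)

    def update_feature_slam(slam_map, frame):
        # Placeholder for the feature-based visual SLAM step: returns the
        # current camera pose estimate and the updated landmark map.
        pose = np.eye(4)
        return pose, slam_map

    def behavior_based_planner(depth_map, pose, slam_map, goal):
        # Placeholder planner blending obstacle avoidance (from the depth map)
        # with goal seeking (from the SLAM pose); returns (v, omega).
        return np.array([0.2, 0.0])

    # One iteration of the navigation loop on synthetic frames.
    prev_frame = np.zeros((120, 160), dtype=np.uint8)
    frame = np.zeros((120, 160), dtype=np.uint8)
    slam_map, goal = [], np.array([5.0, 0.0])

    depth = dense_depth_from_motion(prev_frame, frame)
    pose, slam_map = update_feature_slam(slam_map, frame)
    command = behavior_based_planner(depth, pose, slam_map, goal)
    print("velocity command (v, omega):", command)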
Highlights
In this paper, we present a control architecture for an intelligent outdoor mobile robot
This is achieved through a twofold analysis of the visual data stream: a dense structure from motion algorithm calculates a depth map of the environment, and a visual simultaneous localization and mapping algorithm builds a map of the surroundings using image features.
Relevant shortcomings of this problem are, on the one hand, the computational burden, which limits the applicability of Extended Kalman Filter (EKF)-based Simultaneous Localization and Mapping (SLAM) in large-scale real-time applications, and, on the other hand, the use of linearized solutions, which compromises the consistency of the estimation process (see the sketch after this list).
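For background, both shortcomings can be traced to the standard EKF-SLAM formulation (a textbook sketch, not material from this paper): the joint state stacks the robot pose and all N landmark positions, and each measurement update works on the full joint covariance through a linearized measurement model.

    \[
    \mathbf{x} = \begin{bmatrix} \mathbf{x}_{r} \\ \mathbf{m}_{1} \\ \vdots \\ \mathbf{m}_{N} \end{bmatrix},
    \qquad
    \mathbf{H} = \left.\frac{\partial h}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}},
    \qquad
    \mathbf{K} = \mathbf{P}\mathbf{H}^{\top}\!\left(\mathbf{H}\mathbf{P}\mathbf{H}^{\top} + \mathbf{R}\right)^{-1},
    \qquad
    \mathbf{P} \leftarrow \left(\mathbf{I} - \mathbf{K}\mathbf{H}\right)\mathbf{P}.
    \]

Because the covariance \(\mathbf{P}\) couples every landmark with the robot pose and with every other landmark, each update touches the full matrix and costs \(O(N^{2})\), which is the computational burden mentioned above; and because \(\mathbf{H}\) is only a first-order linearization of the measurement model \(h(\mathbf{x})\) around the current estimate, linearization errors accumulate and can make the filter inconsistent.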
Summary
The basis for feature-based approaches lies in the early work of Longuet-Higgins (Longuet-Higgins, 1981), which describes how to use epipolar geometry to estimate relative motion. These techniques have matured considerably over the past two decades, but a remaining problem is that they deliver only sparse 3D information. In order to bring together the advantages of both sparse and dense SfM techniques, we aim to fuse both methods into an integrated structure recovery algorithm.
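For reference, the relation introduced by Longuet-Higgins that underlies these feature-based approaches is the epipolar constraint (standard formulation, not a contribution of this paper): a point observed at normalized image coordinates \(\mathbf{x}\) in one view and \(\mathbf{x}'\) in the other satisfies

    \[
    \mathbf{x}'^{\top}\mathbf{E}\,\mathbf{x} = 0,
    \qquad
    \mathbf{E} = [\mathbf{t}]_{\times}\mathbf{R},
    \]

where \(\mathbf{R}\) and \(\mathbf{t}\) are the relative rotation and translation between the two camera poses and \([\mathbf{t}]_{\times}\) is the skew-symmetric matrix of \(\mathbf{t}\). Estimating \(\mathbf{E}\) from eight or more point correspondences and decomposing it recovers the relative motion up to scale, but 3D structure is obtained only at the matched feature locations; this is why the reconstruction remains sparse and why fusing it with a dense SfM method is attractive.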