Abstract

A visual regulation strategy based on a decoupled ego-motion estimation technique is presented for a nonholonomic mobile robot. Ego-motion in a static environment can be robustly estimated by planar region alignment: the 2D planar motion between two frames is first detected and used to align the corresponding image regions. This 2D registration removes all effects of camera rotation, and the residual displacement between the two aligned images is an epipolar field centered at the focus of expansion (FOE). The 3D camera translation is recovered from this epipolar field, and the 3D camera rotation is then derived from the recovered translation and the detected 2D motion. In this way, ego-motion estimation is decoupled into a 2D parametric motion and residual epipolar parallax displacements, which avoids many of the ambiguities and instabilities inherent in decomposing image motion into its rotational and translational components, and hence makes the estimation of ego-motion and 3D structure more robust. Based on this ego-motion estimate, an adaptive control law for visual regulation of a nonholonomic mobile robot is presented, and the stability of the closed-loop system is analyzed in the sense of Lyapunov stability theory. Experiments demonstrate the convergence of the proposed visual regulation scheme.
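The FOE-recovery step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the 2D planar registration has already removed the rotational component, so each residual flow vector should point radially away from the FOE, giving one linear constraint `(p - foe) × flow = 0` per point. All names and the synthetic flow field are illustrative.

```python
import numpy as np

def estimate_foe(points, flows):
    """Least-squares FOE from a purely translational (epipolar) flow field.

    After 2D registration cancels camera rotation, each residual flow
    vector at image point p is radial about the FOE, so the 2D cross
    product (p - foe) x flow = 0 yields one linear equation per point:
        flow_y * foe_x - flow_x * foe_y = flow_y * x - flow_x * y
    """
    A = np.column_stack([flows[:, 1], -flows[:, 0]])
    b = flows[:, 1] * points[:, 0] - flows[:, 0] * points[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic check: a radial field expanding from a known FOE
# (hypothetical data, standing in for the aligned-image residuals).
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(50, 2))
true_foe = np.array([0.2, -0.1])
flows = 0.05 * (pts - true_foe)  # pure expansion about the FOE
print(estimate_foe(pts, flows))  # estimate should be close to true_foe
```

Once the FOE (i.e., the translation direction up to scale) is known, the rotation can be solved from the detected 2D planar motion, which is the decoupling the abstract refers to.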
