Abstract

This paper presents a vision-based autonomous navigation system for mobile robots in indoor environments using a teach-and-playback scheme. The system perceives the environment with an omnidirectional image sensor and extracts vertical edges as feature lines. During the teaching stage, the system memorizes a sequence of environmental images together with the robot's poses. During playback navigation, the system computes the robot's positional deviation from the difference between the memorized and currently captured images, and then determines the trajectory needed to track the taught route. The detailed algorithm and experimental results demonstrating the effectiveness of this method are presented.
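The teach-and-playback loop described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the `Snapshot` structure, the use of feature bearing angles, and the simple proportional steering correction are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """One memorized step: robot pose and bearings of vertical-edge features
    seen in the omnidirectional image (hypothetical representation)."""
    x: float
    y: float
    theta: float       # heading (rad)
    bearings: list     # angles (rad) to extracted vertical edges

def teach(route):
    """Teaching stage: memorize a sequence of snapshots along the route."""
    return list(route)

def playback_step(memorized: Snapshot, observed_bearings, gain=0.5):
    """Playback stage: estimate the deviation from the taught route by
    comparing memorized and currently observed feature bearings, and
    return a steering correction (proportional control is an assumption)."""
    diffs = [o - m for o, m in zip(observed_bearings, memorized.bearings)]
    error = sum(diffs) / len(diffs)   # mean bearing shift approximates the offset
    return -gain * error              # steer to reduce the difference

# Usage: one playback step against a single memorized snapshot.
snap = Snapshot(0.0, 0.0, 0.0, bearings=[0.1, 1.2, -0.8])
correction = playback_step(snap, [0.15, 1.25, -0.75])
```

In the actual system the correction would be derived from the full image comparison and fed to the robot's trajectory planner at each step of the taught sequence.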
