Abstract

In this study, we propose an improved view-based navigation method for obstacle avoidance and evaluate its effectiveness in real environments with real obstacles. The proposed method can estimate the position and orientation of a mobile robot even if the robot strays from the recorded path in order to avoid an obstacle. To achieve this, ego-motion estimation was incorporated into an existing view-based navigation system. The ego-motion is calculated from SURF keypoint correspondences between the current view and a recorded view captured with a Kinect sensor. In conventional view-based navigation systems, it is difficult to generate alternative paths for obstacle avoidance; the proposed method is expected to give a mobile robot greater flexibility in path planning so that it can avoid the humans and objects encountered in real environments. Through experiments performed in an indoor environment with a mobile robot, we evaluated the measurement accuracy of the proposed method and confirmed its feasibility for robot navigation in museums and shopping malls.
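The abstract does not include implementation details, but the following minimal sketch illustrates one way the described ego-motion estimation could work: SURF keypoints are matched between a recorded view and the current view, matched pixels are back-projected to 3D using Kinect depth, and a rigid transform (R, t) is recovered with a Kabsch/SVD alignment. The intrinsics, depth units, thresholds, and function names are assumptions for illustration, not the authors' code; SURF requires the opencv-contrib nonfree module.

```python
# Illustrative sketch (not the authors' implementation): estimate ego-motion
# between a recorded view and the current view from SURF matches + Kinect depth.
import cv2
import numpy as np

# Assumed Kinect-like intrinsics (fx, fy, cx, cy); real values come from calibration.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def backproject(u, v, z):
    """Back-project pixel (u, v) with depth z [m, assumed] to a 3D camera-frame point."""
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

def estimate_ego_motion(gray_rec, depth_rec, gray_cur, depth_cur):
    """Return (R, t) mapping points in the recorded view's frame to the current view's frame."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # needs opencv-contrib (nonfree)
    kp1, des1 = surf.detectAndCompute(gray_rec, None)
    kp2, des2 = surf.detectAndCompute(gray_cur, None)

    # Ratio-test matching of SURF descriptors between the two views.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = [m for m, n in matcher.knnMatch(des1, des2, k=2)
               if m.distance < 0.7 * n.distance]

    # Lift matched keypoints to 3D using the Kinect depth images.
    P, Q = [], []
    for m in matches:
        (u1, v1), (u2, v2) = kp1[m.queryIdx].pt, kp2[m.trainIdx].pt
        z1, z2 = depth_rec[int(v1), int(u1)], depth_cur[int(v2), int(u2)]
        if z1 > 0 and z2 > 0:                       # skip invalid depth readings
            P.append(backproject(u1, v1, z1))
            Q.append(backproject(u2, v2, z2))
    P, Q = np.asarray(P), np.asarray(Q)

    # Kabsch/SVD rigid alignment: find R, t minimizing ||(R @ P_i + t) - Q_i||.
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```

In practice a RANSAC loop over the matches would be used to reject outliers before the final alignment; the sketch omits this for brevity.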
