Abstract

In this research, the position and rotation of a mobile robot away from a recorded path are estimated by applying ego-motion to view-based navigation. The ego-motion is calculated from the differences in the 3D positions of SURF feature points between the recorded images and the current image, both obtained with a Kinect sensor. In conventional view-based navigation, it is difficult to plan an alternative path when people or objects are found on the recorded path. The authors' proposed estimation method enables flexible path planning in real environments that contain people and objects. Through experiments in actual indoor environments, the authors evaluated the measurement accuracy of the estimated position and rotation, and confirmed the viability of the method in environments that include people and objects.
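The ego-motion estimate described above amounts to recovering a rigid-body transform (rotation and translation) from pairs of matched 3D feature points. A minimal sketch of that alignment step, using the standard SVD-based least-squares method (Kabsch); this is an illustration under common assumptions, not necessarily the authors' exact formulation, and `estimate_ego_motion` is a hypothetical helper name:

```python
import numpy as np

def estimate_ego_motion(p_rec, p_cur):
    """Estimate R, t so that p_cur ~= R @ p_rec + t.

    p_rec, p_cur: (N, 3) arrays of matched 3D feature positions
    (e.g. SURF keypoints back-projected with Kinect depth) in the
    recorded image and the current image, respectively.
    Hypothetical illustration using the SVD-based Kabsch method.
    """
    # Center both point sets on their centroids.
    c_rec = p_rec.mean(axis=0)
    c_cur = p_cur.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (p_rec - c_rec).T @ (p_cur - c_cur)
    U, _, Vt = np.linalg.svd(H)
    # Correction factor to avoid returning a reflection.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_cur - R @ c_rec
    return R, t
```

In practice such an estimate would be wrapped in an outlier-rejection scheme (e.g. RANSAC), since SURF matches in scenes with moving people and objects include wrong correspondences.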
