Abstract
In this research, estimation of a mobile robot's position and rotation outside of a recorded path is realized by applying ego-motion estimation to view-based navigation. The ego-motion is calculated from the differences in the 3D positions of SURF feature points matched between recorded and current images obtained with a Kinect sensor. In conventional view-based navigation, it is difficult to plan an alternative path when people or objects are found on the recorded path. The authors' proposed estimation method makes flexible path planning possible in actual environments that include people and objects. Based on experiments performed in actual indoor environments, the authors evaluated the measurement accuracy of the robot's position and rotation estimated by their method, and confirmed its viability in real environments containing people and objects.
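The core computation the abstract describes, recovering a rigid motion from differences in the 3D positions of matched feature points, is commonly solved with the SVD-based Kabsch/Umeyama method. The sketch below is illustrative only and is not the authors' exact implementation; the function name and the use of NumPy are assumptions, and SURF matching and Kinect back-projection are taken as already done.

```python
import numpy as np

def estimate_ego_motion(p_ref, p_cur):
    """Estimate the rigid transform (R, t) with p_cur ~= R @ p_ref + t.

    p_ref, p_cur: (N, 3) arrays of matched 3D feature positions,
    e.g. SURF keypoints back-projected using Kinect depth.
    Illustrative Kabsch/Umeyama least-squares solution; not the
    paper's exact method.
    """
    # Center both point sets on their centroids.
    c_ref = p_ref.mean(axis=0)
    c_cur = p_cur.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (p_ref - c_ref).T @ (p_cur - c_cur)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_cur - R @ c_ref
    return R, t
```

Given such an (R, t) between the recorded view and the current view, the robot's position and rotation relative to the recorded path follow directly, which is what enables replanning off the path.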
IEEJ Transactions on Electronics, Information and Systems