Abstract

In this paper we tackle the problem of indoor robot localization using a vision-based approach. Specifically, we propose a visual odometer able to return the relative pose of an omnidirectional automatic guided vehicle (AGV) moving inside an indoor industrial environment. A monocular downward-looking camera, with its optical axis nearly perpendicular to the ground floor, is used to collect floor images. After a preliminary analysis of the images aimed at detecting robust point features (keypoints), descriptors associated with the keypoints are used to match the detected points across consecutive frames. A robust correspondence filter based on statistical and geometrical information is devised to reject incorrect matches, thus delivering better pose estimations. A camera pose compensation is further introduced to ensure better positioning accuracy. The effectiveness of the proposed methodology has been proven through several experiments, in the laboratory as well as in an industrial setting. Both quantitative and qualitative evaluations have been made. The outcomes show that the method provides a final positioning percentage error of 0.21% over an average distance of 17.2 m. A longer run in an industrial context provided comparable results (a percentage error of 0.94% after about 80 m). The average relative positioning error is about 3%, which is in good agreement with the current state of the art.
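
As a rough illustration of the frame-to-frame processing described above, the following Python sketch pairs keypoint detection and descriptor matching with a RANSAC-based planar-motion fit between two consecutive floor images. It is only a minimal sketch built on OpenCV: the ORB detector, the distance-based pre-filter, and the file names are placeholder assumptions, not the paper's actual feature detector or its statistical/geometrical correspondence filter.

```python
import cv2
import numpy as np

# Two consecutive frames from the downward-looking camera
# (file names are hypothetical placeholders).
prev_img = cv2.imread("floor_frame_000.png", cv2.IMREAD_GRAYSCALE)
curr_img = cv2.imread("floor_frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute descriptors. ORB is used here only as a
# stand-in; the abstract does not commit to a specific detector/descriptor.
orb = cv2.ORB_create(nfeatures=1000)
kp_prev, des_prev = orb.detectAndCompute(prev_img, None)
kp_curr, des_curr = orb.detectAndCompute(curr_img, None)

# Match descriptors between the consecutive frames (cross-check for symmetry).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_prev, des_curr), key=lambda m: m.distance)

# Simple pre-filter: keep the best matches by descriptor distance. This is a
# crude placeholder for the statistical/geometrical filter proposed in the paper.
good = matches[: max(20, len(matches) // 2)]
pts_prev = np.float32([kp_prev[m.queryIdx].pt for m in good])
pts_curr = np.float32([kp_curr[m.trainIdx].pt for m in good])

# Estimate a planar similarity transform with RANSAC to reject remaining
# outliers; with the optical axis roughly perpendicular to the floor, the
# inter-frame motion approximates the AGV's planar displacement.
M, inlier_mask = cv2.estimateAffinePartial2D(
    pts_prev, pts_curr, method=cv2.RANSAC, ransacReprojThreshold=2.0
)

d_theta = np.arctan2(M[1, 0], M[0, 0])  # incremental heading change [rad]
dx_px, dy_px = M[0, 2], M[1, 2]         # incremental translation [pixels]
# Converting dx_px, dy_px to metres requires the pixel-to-metre scale, which
# follows from camera calibration and the (fixed) camera height above the floor.
```

Integrating these per-frame increments over time yields the relative pose; the camera pose compensation mentioned in the abstract would correct for residual tilt of the optical axis, which this sketch ignores.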

Highlights

  • Accurate localization and robust navigation are essential for most mobile robot applications. Over the last decades, strong attention has been given to developing novel methodologies and algorithms for achieving autonomous navigation

  • The effectiveness and accuracy of the proposed method have been evaluated through several experiments considering different floor typologies

  • It should be noted that the experimental results reported in the table refer to the data included in the respective authors' works: in most cases, a direct comparison is unfeasible since they use different sensors and/or setup configurations

Introduction

Accurate localization and robust navigation are essential for most mobile robot applications. Over the last decades, strong attention has been given to developing novel methodologies and algorithms for achieving autonomous navigation. Accurate pose estimation increases the capability of robots/vehicles and autonomous systems to navigate properly inside the environment, reducing the failure rate during missions. Industry 4.0 is establishing new paradigms and concepts regarding robot navigation and localization, especially in logistics and transport [1]. Robots will no longer be constrained to limited working areas, but will move and operate in different zones of factories and warehouses. In this regard, robust and reliable algorithms for localization purposes have to be devised.
