Abstract

Estimating the pose of a mobile robotic platform is a challenging task, especially when the pose must be estimated in a global or local reference frame and while the platform is moving. While the position of a platform can be measured directly via modern tachymetry or with a global navigation satellite system (GNSS), the absolute platform orientation is harder to derive. Most often, only the relative orientation is estimated with sensors mounted on the robotic platform, such as an IMU, one or multiple cameras, a laser scanner, or a combination thereof; a sensor fusion of the relative orientation and the absolute position is then performed. In this work, an additional approach is presented: first, an image-based relative pose estimation is performed on frames from a panoramic camera using a state-of-the-art visual odometry implementation. Second, the position of the platform in a reference system is estimated using motorized tachymetry. Last, the absolute orientation is calculated using a visual marker placed in the space in which the robotic platform is moving. The marker can be detected in the camera frame, and since the position of this marker is known in the reference system, the absolute pose can be estimated. To improve the absolute pose estimation, a sensor fusion is conducted. Results with a Lego model train as the mobile platform show that the absolute-pose trajectories calculated independently with four different markers deviate by less than 0.66 degrees 50% of the time, with an average difference below 1.17 degrees. The implementation is based on the popular Robot Operating System (ROS).
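
The marker-based orientation step rests on a simple geometric relation: the azimuth from the platform to the marker is known in the reference frame, since both positions are known, and the same direction is observed in the camera frame; their difference is the absolute heading. Below is a minimal 2D sketch of this idea, assuming the marker bearing in the body frame has already been derived from the detection; all names and values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def absolute_heading(platform_xy, marker_xy, marker_bearing_body):
    """Absolute platform heading in the reference frame.

    platform_xy         -- platform position from tachymetry (reference frame)
    marker_xy           -- known marker position (reference frame)
    marker_bearing_body -- bearing of the detected marker in the body frame,
                           radians, counter-clockwise from the camera axis
    """
    d = np.asarray(marker_xy, dtype=float) - np.asarray(platform_xy, dtype=float)
    azimuth_ref = np.arctan2(d[1], d[0])  # direction to the marker, reference frame
    # reference azimuth = heading + body bearing  =>  heading = azimuth - bearing
    heading = azimuth_ref - marker_bearing_body
    return (heading + np.pi) % (2 * np.pi) - np.pi  # wrap to (-pi, pi]

# Example: marker seen 30 degrees to the left of the camera axis
h = absolute_heading((10.0, 5.0), (14.0, 9.0), np.deg2rad(30.0))
print(np.rad2deg(h))  # 15.0
```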

Highlights

  • The precise estimation of position and orientation of robots or autonomous vehicles is becoming increasingly important

  • This work aims at image-based orientation determination for a variety of mobile platforms: mobile robots, the tip of a robotic arm, monitored vehicles, or mobile measurement platforms

  • The time offset between the sensor data streams can be determined via cross-correlation of a commonly observed signal, in this case the heading of the mobile platform (a sketch follows this list)
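
Because the camera and the tachymeter run on unsynchronized clocks, their data streams must be aligned in time before fusion. A minimal sketch of the cross-correlation idea follows, assuming both heading series are already resampled to a common rate; the function name and the synthetic test data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def estimate_time_offset(heading_a, heading_b, dt):
    """Estimate by how many seconds heading_b lags heading_a via
    cross-correlation; both series must be sampled at the same rate dt."""
    a = np.unwrap(heading_a)
    b = np.unwrap(heading_b)
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")     # correlate b against a
    lag = int(np.argmax(corr)) - (len(a) - 1)  # lag in samples
    return lag * dt                            # positive: b lags a

# Synthetic check at 10 Hz: b is a copy of a delayed by 12 samples (1.2 s)
rng = np.random.default_rng(0)
a = np.cumsum(rng.standard_normal(600)) * 0.01    # random-walk heading
b = np.concatenate([np.full(12, a[0]), a[:-12]])  # delayed copy
print(estimate_time_offset(a, b, dt=0.1))         # approx. 1.2
```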


Summary

Introduction

The precise estimation of the position and orientation of robots or autonomous vehicles is becoming increasingly important. The challenge is the highly precise estimation of these values in a kinematic system. Position can be determined in near real time using various approaches, such as motorized tachymeters (indoors and outdoors) or GNSS (outdoors). Determining the exact orientation of these platforms, however, is challenging. Conventional visual SLAM and visual odometry approaches can determine position and orientation, but only relative to their initial pose. In an unpublished research project of the Institute of Geomatics (IGEO) at the FHNW, the orientation of a platform is calculated with high accuracy using a motion capture system carried on board the mobile platform. The setup of this system is complex and scalable only to a limited extent.
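
Since visual odometry yields poses only relative to its start frame, one known absolute pose of that start frame (e.g. from tachymetry plus a marker observation) suffices to anchor the whole relative trajectory in the reference frame. The following 2D sketch illustrates this transformation; the names and example values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rot2d(theta):
    """Rotation matrix for an angle theta (radians), counter-clockwise."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def anchor_trajectory(rel_poses, abs_position0, abs_heading0):
    """Map visual-odometry poses, given relative to the VO start frame,
    into the absolute reference frame, using one known absolute pose
    (abs_position0, abs_heading0) of that start frame.

    rel_poses: iterable of ((x, y), theta) relative to the start frame.
    Returns a list of (position_xy, heading) in the reference frame.
    """
    R0 = rot2d(abs_heading0)
    p0 = np.asarray(abs_position0, dtype=float)
    return [(p0 + R0 @ np.asarray(t, dtype=float), abs_heading0 + theta)
            for t, theta in rel_poses]

# Hypothetical VO output: 2 m straight, then a left curve ending in a 90 deg turn
rel = [((0.0, 0.0), 0.0), ((2.0, 0.0), 0.0), ((3.0, 1.0), np.pi / 2)]
for p, h in anchor_trajectory(rel, abs_position0=(100.0, 50.0),
                              abs_heading0=np.deg2rad(45.0)):
    print(np.round(p, 3), round(np.rad2deg(h), 1))
```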

