Abstract
3D localization is an essential process for many types of applications, such as augmented reality and mobile robotics. To achieve it, outdoor applications increasingly rely on hybrid sensor systems. Combining different types of sensors tends to overcome the drawbacks of using a single sensor type and thus increases robustness and accuracy. However, combining several types of sensors raises various problems, among which calibration is an important step. Indeed, each sensor provides measurements in its own coordinate system, so the measurements from each sensor must be converted into a common reference coordinate system. This process consists in estimating the transformations that map one coordinate system to another, and the accuracy of the hybrid sensor depends on the accuracy of this procedure. In our work, we use three types of sensors: a camera, an inertial measurement unit (IMU), and a GPS receiver. This hybrid sensor must provide, at any time, the position and orientation of the camera's point of view with respect to the world coordinate system (the camera pose). Thus, the orientations provided by the IMU and the GPS positions must be re-expressed in the camera coordinate system. In this paper, we propose two calibration approaches based on models associated with each pair of sensors. Results from experiments conducted under real conditions are also presented.
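To make the frame-conversion idea concrete, the sketch below (not the paper's method; the rotation, translation, and frame names are illustrative assumptions) shows how a rigid transform, a rotation R and translation t, re-expresses a point measured in one sensor's coordinate system in another's, and how two such transforms compose into a single one:

```python
import math

# A rigid transform (R, t) maps a point p from frame A into frame B
# via q = R @ p + t. R is a 3x3 rotation (row-major nested lists).

def mat_vec(R, p):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

def transform(R, t, p):
    """Express point p (given in frame A) in frame B, using the A->B transform."""
    q = mat_vec(R, p)
    return [q[i] + t[i] for i in range(3)]

def compose(R_ab, t_ab, R_bc, t_bc):
    """Chain A->B and B->C into A->C: R = R_bc * R_ab, t = R_bc * t_ab + t_bc."""
    R = [[sum(R_bc[i][k] * R_ab[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    rt = mat_vec(R_bc, t_ab)
    t = [rt[i] + t_bc[i] for i in range(3)]
    return R, t

# Hypothetical IMU->camera transform: 90 degrees about the z axis,
# with a 5 cm lever arm between the two sensor origins (made-up values).
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
R_ic = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
t_ic = [0.05, 0.0, 0.0]

p_imu = [1.0, 0.0, 0.0]       # a point measured in the IMU frame
p_cam = transform(R_ic, t_ic, p_imu)  # the same point in the camera frame
```

Calibration, as described in the abstract, is precisely the problem of estimating `R_ic` and `t_ic` for each pair of sensors; once known, any measurement can be chained through `compose` into the world frame.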