Abstract

In this work we introduce a relative localization method that estimates the coordinate frame transformation between two devices from distance measurements. We present a linear algorithm that calculates the relative pose in 2D or 3D with four degrees of freedom (4-DOF), requiring a minimum of five or six distance measurements, respectively, for a unique estimate. We use the linear algorithm in conjunction with outlier detection algorithms and as a good initial estimate for iterative least-squares refinement. The proposed method outperforms other related linear methods both in the number of distance measurements needed and in accuracy. Compared with a related linear algorithm in 2D, we reduce the translation error by 10%. Compared with the more general 6-DOF linear algorithm, our 4-DOF method reduces the minimum number of distance measurements from ten to six and the rotation error by a factor of four at the standard deviation of our ultra-wideband (UWB) transponders. With the same number of measurements, the orientation and translation errors are reduced by approximately a factor of ten. We validate our method with simulations and an experimental setup in which we integrate UWB technology into simultaneous localization and mapping (SLAM)-based devices. The presented relative pose estimation method is intended for augmented reality applications with cooperative localization on head-mounted displays. We foresee practical use cases of this method in cooperative SLAM, where map merging can be performed proactively.

Highlights

  • We present a relative localization method based on distance measurements between two UWB/SLAM-based devices and a novel linear algorithm for estimating the unique 4-DOF relative pose with a minimum of six distance measurements

  • We observe that fixing the Euclidean norm of the translation ‖ε̃‖₂ to the first distance measurement d₁ reduces the translation error when the number of measured distances is small
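The 4-DOF estimation problem from the highlights can be illustrated with a small numerical sketch. The code below is not the paper's linear algorithm; it is a generic Gauss-Newton least-squares refinement (the kind of iterative refinement mentioned in the abstract) of a yaw-plus-translation pose from pairwise distance measurements. All function names and the simulated geometry are our own assumptions for illustration.

```python
import numpy as np

def rz(theta):
    """Rotation about the (gravity-aligned) z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def residuals(params, A, B, d):
    """Range residuals: predicted inter-device distance minus measured distance.

    A: (n,3) positions of device A in A's frame.
    B: (n,3) positions of device B in B's frame.
    d: (n,) measured distances between the devices at each epoch.
    params: [theta, tx, ty, tz], the 4-DOF transform mapping B's frame into A's.
    """
    theta, t = params[0], params[1:]
    pred = np.linalg.norm(A - (B @ rz(theta).T + t), axis=1)
    return pred - d

def estimate_pose_4dof(A, B, d, x0=(0.0, 0.0, 0.0, 0.0), iters=100):
    """Gauss-Newton refinement of the 4-DOF pose (needs >= 4, robustly >= 6 distances)."""
    x = np.asarray(x0, dtype=float).copy()
    eps = 1e-6
    for _ in range(iters):
        r = residuals(x, A, B, d)
        # Numerical Jacobian of the residuals w.r.t. the four parameters.
        J = np.empty((len(d), 4))
        for j in range(4):
            dx = np.zeros(4)
            dx[j] = eps
            J[:, j] = (residuals(x + dx, A, B, d) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x += step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

As a usage sketch, one can simulate both trajectories, compute noiseless distances under a known transform, and check that the estimator recovers it; with noisy ranges, the same routine serves as the refinement step after a linear initialization.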


Introduction

A considerable amount of research and progress has been made in the last few decades in the field of simultaneous localization and mapping (SLAM) [1,2]. SLAM methods concurrently map the surrounding three-dimensional space and estimate the device's own pose (position and orientation) relative to the constructed map. Cooperative SLAM (C-SLAM) is a related important topic with special relevance in robotics, and it adds an additional layer of complexity because it requires prior knowledge of the map or of the relative poses (coordinate frame transformations). The relative poses are calculated either by using common information from the local maps or by meeting (rendezvous) and performing relative line-of-sight (LOS) measurements. The need to scan common areas of the environment or to meet, especially if the relative poses are not known, results in a time loss, which is unacceptable in time-critical applications such as emergency rescue operations.
