Abstract

Radar sensor networks are widely used in modern driver assistance systems to generate robust and precise representations of the environment. The quality of the environment mapping depends significantly on accurate knowledge of the mounting position and orientation of the radar sensors. In particular, the relative orientation and position of each sensor with respect to the others are of great importance. In this paper, an algorithm is presented that estimates the relative orientation about all three rotation axes as well as the relative position in all three dimensions of distributed radar sensors. The only requirements are an overlapping field of view (FoV) of the sensors and a moving point target, e.g. a corner reflector, which may follow an arbitrary trajectory; its position and velocity need not be known. The estimation accuracy, verified by measurements, is up to 4 cm for the position and 0.35° for the orientation in 3D space.
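
To illustrate the underlying calibration idea, the sketch below shows one standard way such an extrinsic estimate can be obtained: if both sensors observe the same point target at synchronized time steps, the two trajectories can be rigidly aligned via a Kabsch/Procrustes fit. This is a minimal illustrative sketch, not the authors' algorithm; all function and variable names are assumptions for the example.

```python
# Illustrative sketch (not the paper's method): estimate the relative rotation R
# and translation t between two radar sensor frames from synchronized detections
# of the same moving point target (e.g. a corner reflector) in their overlapping FoV.
import numpy as np

def estimate_extrinsics(pts_a: np.ndarray, pts_b: np.ndarray):
    """Estimate R, t such that pts_a ~= pts_b @ R.T + t.

    pts_a, pts_b: (N, 3) detections of the same target trajectory, expressed
    in sensor A's and sensor B's coordinate frame, respectively.
    """
    centroid_a = pts_a.mean(axis=0)
    centroid_b = pts_b.mean(axis=0)
    # Cross-covariance of the centered trajectories (Kabsch algorithm)
    H = (pts_b - centroid_b).T @ (pts_a - centroid_a)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation (det = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = centroid_a - R @ centroid_b
    return R, t

# Usage: simulate a trajectory seen by both sensors under a known offset
rng = np.random.default_rng(0)
traj_b = rng.uniform(-10.0, 10.0, size=(200, 3))   # detections in frame B
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))       # random true rotation
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1.0                                # enforce a proper rotation
t_true = np.array([1.5, -0.3, 0.8])
traj_a = traj_b @ Q.T + t_true                     # same points in frame A
R_est, t_est = estimate_extrinsics(traj_a, traj_b)
assert np.allclose(R_est, Q) and np.allclose(t_est, t_true)
```

In practice, radar detections are noisy and the target must first be associated across sensors; the paper's accuracy figures (4 cm, 0.35°) reflect a complete estimation pipeline rather than this idealized noise-free alignment.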
