Radar sensor networks are widely used today in autonomous driving and for generating high-precision images of the environment. The accuracy of the environmental representation depends to a large extent on precise knowledge of each sensor's mounting orientation. Both the relative orientation of the sensors to each other and the orientation of the sensors with respect to the vehicle coordinate system are determining factors. For the first time, the orientation of the radar sensors in a network can be estimated exclusively from radar target lists, without additional localization and orientation devices such as an IMU or GNSS. In this work, two algorithms for determining the orientation of incoherently networked radar sensors with respect to the vehicle coordinate system and with respect to each other are derived and characterized. With the presented algorithms, orientation accuracies of up to 0.25° are achieved. Furthermore, the algorithms do not impose any requirements on the positioning or orientation of the radar sensors, such as overlapping fields of view (FOVs) or the detection of identical targets. The presented algorithms are applicable to arbitrary driving trajectories as well as to both point targets and extended targets, which enables their use in regular road traffic.
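The abstract does not describe the algorithms themselves, but the following minimal sketch illustrates the general idea of estimating a radar's mounting orientation from a target list alone. It assumes (as an illustration, not as the paper's method) that the detections are stationary targets, that the vehicle drives straight without slip, and that the mounting yaw can be obtained from the direction of the sensor velocity recovered from azimuth/Doppler pairs via least squares. All function names and the synthetic data are hypothetical.

```python
import numpy as np

def estimate_sensor_velocity(azimuth_rad, doppler_mps):
    # Least-squares fit of the sensor's over-ground velocity (vx, vy) in the
    # sensor frame. For stationary targets the measured radial (Doppler)
    # velocity satisfies v_r = -(vx*cos(theta) + vy*sin(theta)), where theta
    # is the target azimuth in the sensor frame.
    A = -np.column_stack((np.cos(azimuth_rad), np.sin(azimuth_rad)))
    v, *_ = np.linalg.lstsq(A, doppler_mps, rcond=None)
    return v

def estimate_mounting_yaw(azimuth_rad, doppler_mps):
    # Mounting yaw relative to the vehicle's longitudinal axis. Valid only for
    # straight, slip-free driving: the sensor velocity then points along the
    # vehicle x-axis, so the yaw is minus the velocity direction observed in
    # the sensor frame.
    vx, vy = estimate_sensor_velocity(azimuth_rad, doppler_mps)
    return -np.arctan2(vy, vx)

# Synthetic usage example (hypothetical data): sensor mounted at +10 deg yaw,
# vehicle driving straight at 15 m/s.
rng = np.random.default_rng(0)
true_yaw = np.deg2rad(10.0)
speed = 15.0
theta = rng.uniform(-np.pi / 3, np.pi / 3, 50)                    # azimuths, sensor frame
v_sx, v_sy = speed * np.cos(true_yaw), -speed * np.sin(true_yaw)  # vehicle velocity in sensor frame
doppler = -(v_sx * np.cos(theta) + v_sy * np.sin(theta))
doppler += rng.normal(0.0, 0.05, theta.size)                      # Doppler measurement noise

print(np.rad2deg(estimate_mounting_yaw(theta, doppler)))          # approx. 10 deg
```

This sketch only covers a single sensor under a restrictive motion assumption; the paper's algorithms additionally handle arbitrary trajectories, extended targets, and the relative orientation between sensors of the network.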