Abstract

To validate the accuracy and reliability of onboard sensors for object detection and localization in driver assistance and autonomous driving applications under realistic conditions (indoors and outdoors), a novel tracking system is presented. The system determines the position and orientation of a slow-moving vehicle during test maneuvers within a reference environment (e.g., a car during parking maneuvers), independently of the onboard sensors. The requirements are a 6 degree of freedom (DoF) pose with position uncertainty below 5 mm (3σ) and orientation uncertainty below 0.3° (3σ), at an update rate above 20 Hz and a latency below 500 ms. To compare the results of the reference system with the vehicle's onboard system, synchronization via the Precision Time Protocol (PTP) and interoperability with the Robot Operating System (ROS) are provided. The developed system combines motion capture cameras, mounted in a 360° panorama-view setup on the vehicle, which measure retroreflective markers with known coordinates distributed over the test site, while robotic total stations measure a prism on the vehicle. A point cloud of the test site serves as a digital twin of the environment, in which the movement of the vehicle is visualized. The results show that the fused measurements of these sensors complement each other, so that the accuracy requirements for the 6 DoF pose are met while allowing flexible installation in different environments.
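The stated requirements (position uncertainty < 5 mm at 3σ, orientation uncertainty < 0.3° at 3σ, rate > 20 Hz, latency < 500 ms) can be checked programmatically against a logged pose stream. The following is a minimal sketch; all type and function names are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

# Requirement thresholds stated in the abstract (3-sigma values).
POS_3SIGMA_MM = 5.0    # position uncertainty < 5 mm (3σ)
ORI_3SIGMA_DEG = 0.3   # orientation uncertainty < 0.3° (3σ)
MIN_RATE_HZ = 20.0     # pose update rate > 20 Hz
MAX_LATENCY_S = 0.5    # end-to-end latency < 500 ms

@dataclass
class PoseSample:
    t_meas: float         # measurement timestamp (s, PTP-synchronized)
    t_recv: float         # reception timestamp at the consumer (s)
    pos_3sigma_mm: float  # reported 3σ position uncertainty
    ori_3sigma_deg: float # reported 3σ orientation uncertainty

def meets_requirements(samples: list[PoseSample]) -> bool:
    """Check a logged pose stream against the 6 DoF requirements."""
    if len(samples) < 2:
        return False
    # Worst-case update rate from the largest gap between measurements.
    dts = [b.t_meas - a.t_meas for a, b in zip(samples, samples[1:])]
    rate_hz = 1.0 / max(dts)
    # Worst-case latency between measurement and reception.
    latency_s = max(s.t_recv - s.t_meas for s in samples)
    return (rate_hz > MIN_RATE_HZ
            and latency_s < MAX_LATENCY_S
            and all(s.pos_3sigma_mm < POS_3SIGMA_MM for s in samples)
            and all(s.ori_3sigma_deg < ORI_3SIGMA_DEG for s in samples))
```

Because the timestamps are PTP-synchronized, measurement and reception times share a common clock, so the latency term is directly comparable across systems.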

Highlights

  • Autonomous driving algorithms use different integrated sensors whose outputs are fused to provide information to control the vehicle

  • This application can be problematic due to the high number of cameras required for larger measurement volumes, as well as disturbance signals caused by sunlight reflections on surfaces

  • Instead of the classical motion capture approach, which monitors the motions of marker-tagged objects from multiple angles with static cameras, the principle is inverted and the cameras are placed with a panoramic view on top of the moving object


Introduction

Autonomous driving algorithms use different integrated sensors (camera, lidar, radar, etc.) whose outputs are fused to provide information to control the vehicle. Optical motion capture has shown high accuracy, with errors in the sub-millimeter range, but its application can be problematic due to the high number of cameras required for larger measurement volumes, as well as disturbance signals caused by sunlight reflections on surfaces. Computationally intensive algorithms make this technique hard to implement for real-time applications with high accuracy requirements, and the technology is identical to one of the vehicle sensors tested in this study, hindering an independent assessment. Inertial measurement returns data at high rates ranging from 100 Hz up to the kHz range [19,20]; however, to derive changes in position, double integration over time is necessary, which leads to significant drift. An IMU can therefore complement other measurement systems with frequent measurements but would require correction values to control the drift.
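The drift from double integration can be illustrated with a short simulation: a small constant accelerometer bias, integrated twice, produces a position error that grows quadratically with time (e(t) ≈ ½·b·t²). A minimal 1-D sketch with hypothetical values, not taken from the paper:

```python
def integrate_position(accels, dt):
    """Double-integrate acceleration samples (1-D) to position."""
    vel, pos = 0.0, 0.0
    positions = []
    for a in accels:
        vel += a * dt   # first integration: acceleration -> velocity
        pos += vel * dt  # second integration: velocity -> position
        positions.append(pos)
    return positions

dt = 0.01                        # 100 Hz IMU rate, lower end of cited range
bias = 0.01                      # assumed 0.01 m/s^2 accelerometer bias
n = 1000                         # 10 s of data
true_accel = [0.0] * n           # vehicle actually stationary
measured = [a + bias for a in true_accel]
drift = integrate_position(measured, dt)[-1]
# After 10 s, the bias alone yields roughly 0.5 m of position error,
# far beyond a 5 mm requirement -- hence the need for external corrections.
```

This is why the IMU is useful only as a high-rate complement: an external reference must periodically reset the accumulated error.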

Materials and Methods
Physical Setup
Visualization and Postprocessing
Requirement Coverage
Synchronization
MS60 Position Influence
Comparison of Position
Accuracy of the Motion Capture System
Discussion
Conclusions