Abstract

We present an open hardware and software platform for efficiently fusing heterogeneous sensor data in an automotive/robotic context. The framework presented in this paper provides researchers with a base platform for developing and evaluating sensor fusion strategies. In contrast to similar approaches, this framework exploits raw radar data in particular and enables fusion at a low level. The proposed system utilizes low-level data from radar sensors as well as indirect (e.g. 3D imaging) and direct (e.g. LIDAR) Time-of-Flight (ToF) sensors. After a configurable amount of pre-processing at the sensor level, the sensor data is transferred to a centralized platform and aligned temporally and spatially. We demonstrate the transformation of radar data into the 3D coordinate system in order to fuse it with point cloud data from ToF sensors. Owing to its modular structure, the framework also enables the exploration of various system partitioning concepts.
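The spatial alignment step described above can be illustrated with a minimal sketch: polar radar detections (range, azimuth, elevation) are converted into Cartesian points and then mapped into a common platform frame via an extrinsic rotation and translation, so they can be fused with a ToF point cloud. All function names and the example extrinsics below are illustrative assumptions, not part of the framework's actual API.

```python
import numpy as np

def radar_to_cartesian(r, azimuth, elevation):
    """Convert radar detections (range in metres, azimuth/elevation in
    radians) into 3D Cartesian points in the radar's own frame."""
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return np.stack([x, y, z], axis=-1)

def to_platform_frame(points, R, t):
    """Express points (N, 3) in the common platform frame using an
    extrinsic rotation R (3x3) and translation t (3,)."""
    return points @ R.T + t

# Example: one detection at 10 m, 30 deg azimuth, 0 deg elevation,
# with an assumed radar mounting offset of 0.5 m above the platform origin.
pts = radar_to_cartesian(np.array([10.0]),
                         np.deg2rad([30.0]),
                         np.deg2rad([0.0]))
aligned = to_platform_frame(pts, np.eye(3), np.array([0.0, 0.0, 0.5]))
```

Once radar detections are expressed in the same Cartesian frame as the ToF point clouds, low-level fusion (e.g. point-wise association) becomes a plain geometric operation.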
