Abstract

In autonomous driving, the sensing system plays a crucial role, and its accuracy and reliability directly affect the overall safety of the vehicle. Despite this, fault diagnosis for sensing systems has not received widespread attention, and existing research has limitations. This paper focuses on the distinctive characteristics of autonomous driving sensing systems and proposes a fault diagnosis method that combines hardware redundancy with analytical redundancy. First, to keep the study grounded in real-world conditions, we define 12 common real-world faults and inject them into the nuScenes dataset to create an extended dataset. Then, exploiting heterogeneous hardware redundancy, we fuse millimeter-wave (MMW) radar, LiDAR, and camera data by projecting them into pixel space, and use the “ground truth” derived from the MMW radar to detect faults in the LiDAR and camera data. Finally, we use multidimensional temporal entropy to characterize fluctuations in the information complexity of the LiDAR and camera data during faults, and we build a CNN-based multi-class time-series classification model to identify fault types. In experiments on real vehicles, the proposed method achieves 95.33% accuracy in fault detection and 82.89% accuracy in fault diagnosis, with average response times of 0.87 s and 1.36 s, respectively. The results demonstrate that the proposed method can effectively detect and diagnose faults in sensing systems and respond rapidly, providing enhanced reliability for autonomous driving systems.
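The fusion step described above, projecting heterogeneous sensor returns into a common pixel space so that radar-derived positions can be checked against LiDAR and camera observations, can be illustrated with a minimal sketch. The following Python function is a hypothetical example assuming a standard pinhole camera model and a known sensor-to-camera calibration; it is not the paper's implementation, and the names `project_points_to_pixels`, `T_sensor_to_cam`, and `K` are illustrative only.

```python
import numpy as np

def project_points_to_pixels(points_3d, T_sensor_to_cam, K):
    """Project 3D sensor points (e.g., MMW radar or LiDAR returns) into
    camera pixel coordinates.

    points_3d       : (N, 3) points in the sensor frame
    T_sensor_to_cam : (4, 4) homogeneous transform from sensor to camera frame
    K               : (3, 3) camera intrinsic matrix
    Returns (N, 2) pixel coordinates and a boolean mask of points with
    positive depth (in front of the camera).
    """
    # Convert to homogeneous coordinates and transform into the camera frame
    pts_h = np.hstack([points_3d, np.ones((points_3d.shape[0], 1))])
    pts_cam = (T_sensor_to_cam @ pts_h.T).T[:, :3]

    # Only points with positive depth project onto the image plane
    in_front = pts_cam[:, 2] > 1e-6

    # Perspective projection with the intrinsic matrix
    uvw = (K @ pts_cam.T).T
    pixels = uvw[:, :2] / uvw[:, 2:3]
    return pixels, in_front
```

Under this kind of scheme, radar detections projected into pixel space can be compared against LiDAR and camera detections frame by frame; abrupt growth in the residuals between them is one plausible cue for flagging a fault, consistent with the redundancy-based detection idea summarized in the abstract.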
