The reliability of the sensing system directly affects the overall safety of an autonomous driving system. However, fault diagnosis for perception systems remains an under-researched area, with limited attention and few dedicated solutions. In this paper, we present an information-fusion-based fault-diagnosis method for autonomous driving perception systems. First, we built an autonomous driving simulation scenario in PreScan that collects data from a single millimeter-wave (MMW) radar and a single camera sensor. The camera images are then detected and labeled by a convolutional neural network (CNN). Next, we fused the measurements from the MMW radar and the camera in space and time, mapping the MMW radar points onto the camera image to obtain the region of interest (ROI). Finally, we developed a method that uses information from the single MMW radar to assist in diagnosing faults in the single camera sensor. Simulation results show that for missing row/column pixel faults, the deviation typically falls between 34.11% and 99.84%, with a response time of 0.02 s to 1.6 s; for pixel-shift faults, the deviation ranges from 0.32% to 9.92%, with a response time of 0 s to 0.16 s; and for target color loss, the deviation ranges from 0.26% to 2.88%, with a response time of 0 s to 0.05 s. These results demonstrate that the method can detect sensor faults and issue fault alerts in real time, providing a basis for designing and developing simpler and more user-friendly autonomous driving systems. Furthermore, the method illustrates the principles of information fusion between camera and MMW radar sensors, laying a foundation for building more complex autonomous driving systems.
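The abstract does not give implementation details of the fusion and diagnosis steps, but the core idea (projecting radar returns onto the image plane and flagging a camera fault when the CNN detection deviates from the radar-predicted ROI) can be sketched as follows. This is a minimal illustration, assuming a standard pinhole projection; the intrinsic matrix `K`, the radar-to-camera extrinsics `R` and `t`, the normalized deviation metric, and the 10% alert threshold are all hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical calibration values; the actual parameters would come from the
# PreScan sensor configuration used in the paper.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsics (pixels)
R = np.eye(3)                           # radar-to-camera rotation
t = np.array([0.0, 0.5, 0.0])           # radar-to-camera translation (m)

def radar_point_to_pixel(p_radar):
    """Project a 3-D radar return (x, y, z in meters) onto the image plane."""
    p_cam = R @ p_radar + t             # express the point in the camera frame
    u, v, w = K @ p_cam                 # pinhole projection
    return np.array([u / w, v / w])     # homogeneous -> pixel coordinates

def roi_deviation(radar_center_px, cnn_center_px, roi_diag_px):
    """Deviation of the CNN-detected target center from the radar-projected
    center, normalized by the ROI diagonal (a stand-in for the paper's metric)."""
    return np.linalg.norm(radar_center_px - cnn_center_px) / roi_diag_px

# Example: raise a real-time alert when the deviation exceeds a threshold.
radar_px = radar_point_to_pixel(np.array([0.8, 1.2, 25.0]))
cnn_px = np.array([350.0, 255.0])       # target center reported by the CNN
if roi_deviation(radar_px, cnn_px, roi_diag_px=120.0) > 0.10:
    print("camera fault suspected: raise real-time alert")
```

Under this reading, a healthy camera keeps the deviation near zero, while faults such as missing pixel rows/columns or pixel shift displace the CNN detection away from the radar-predicted position, pushing the deviation past the threshold.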