Abstract

Vehicle safety promises to be one of the biggest benefits of Advanced Driver Assistance Systems (ADAS). Higher levels of automation remove the human driver from the chain of events that can lead to a crash. Sensors play an influential role both in manual driving and in ADAS by helping the driver monitor the vehicle's surroundings, and they substantially reduce the driving load of steering, accelerating and braking on long journeys. The development of future intelligent vehicles relies even more heavily on the fusion of data from surrounding sensors such as camera, LiDAR and radar. These sensors must perceive accurately not only in clear weather but also under adverse weather and illumination conditions; otherwise, a small error could have an incalculable impact on ADAS. Most current studies, however, are based on indoor or static testing. To address this limitation, this paper designs a series of dynamic test cases using outdoor rain and intelligent lighting simulation facilities, making the sensor application scenarios more realistic. On this basis, the effect of rainfall and illumination on sensor perception performance is investigated. As expected, the performance of all automotive sensors is degraded by adverse environmental factors, but their behaviour is not identical. Future work on sensor model development and sensor information fusion should therefore take these differences into account.
