Abstract

An autonomous navigation system relies on a number of sensors, including radar, LIDAR, and a visible-light camera, for its operation. In this work, we focus on the visible-light camera. Object detection is the key first step in processing the camera's video input. Specifically, we address the problem of assessing the performance of object detection algorithms under the hazardous driving conditions that an autonomous navigation system is expected to encounter in realistic scenarios. To this end, we propose a novel metric for quantifying the degradation in performance of an object detection algorithm under different weather conditions. Additionally, we introduce a real-time method for detecting extreme variations in the algorithm's performance, which can be used to issue an alert. We evaluate our metric and alerting system and demonstrate their utility using the YOLOv2 object detection algorithm trained on the KITTI and Virtual KITTI datasets.
