Abstract

Preceding vehicle detection and tracking at nighttime are challenging problems because extraneous illuminant sources coexist with the vehicle lights. To improve the accuracy and robustness of vehicle detection, a novel method for nighttime vehicle detection and tracking is proposed in this paper. The gray-level characteristics of taillights are used to determine the lower boundary of the threshold for taillight segmentation, and the optimal threshold is then calculated with the OTSU algorithm over the range between that lower boundary and the highest gray level of the region of interest. Candidate taillight pairs are extracted based on the similarity between the left and right taillights, and non-vehicle taillight pairs are removed through relevance analysis of the vehicle location between frames. To reduce the false negative rate of vehicle detection, a vehicle tracking method based on taillight estimation is applied. The candidate taillight spot is sought in the region predicted by Kalman filtering, and a disturbed taillight is estimated from the symmetry and location of the other taillight of the same vehicle. Vehicle tracking is completed by estimating the vehicle location from the two taillight spots. Experiments on a vehicle platform indicate that the proposed method can detect vehicles quickly, correctly, and robustly in real traffic environments with illumination variation.
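
As a concrete illustration of the segmentation step, the sketch below computes an Otsu threshold restricted to the interval between a given lower boundary and the highest gray level of the ROI, as the abstract describes. The function name and the way the lower boundary is supplied are assumptions for illustration; the paper's derivation of that boundary from taillight gray-level characteristics is not reproduced here.

```python
import numpy as np

def bounded_otsu_threshold(roi_gray: np.ndarray, lower_bound: int) -> int:
    """Otsu threshold restricted to [lower_bound, roi_gray.max()].

    'lower_bound' stands in for the taillight-derived lower boundary
    described in the paper; its derivation is not reproduced here.
    """
    upper_bound = int(roi_gray.max())
    # Per-gray-level histogram of the ROI (assumes an 8-bit image).
    hist, _ = np.histogram(roi_gray, bins=256, range=(0, 256))
    hist = hist[lower_bound:upper_bound + 1].astype(np.float64)
    if hist.sum() == 0:
        return lower_bound

    levels = np.arange(lower_bound, upper_bound + 1, dtype=np.float64)
    best_t, best_var = lower_bound, -1.0
    for i in range(1, len(hist)):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:i] * hist[:i]).sum() / w0
        mu1 = (levels[i:] * hist[i:]).sum() / w1
        # Proportional to Otsu's between-class variance; the argmax
        # is unchanged by the dropped normalization constant.
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, int(levels[i])
    return best_t
```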

Highlights

  • Statistics in the EU, USA, and China indicate that rear-end collisions are a principal cause of highway accidents, and the risk of traffic accidents at night is greater than in the daytime [1,2,3]

  • Because taillight-pairing rules are usually limited by fixed thresholds and discard the vehicle candidate whose two spots are less similar, global rule-based vehicle detection may produce incorrect taillight pairs in some real traffic scenes (a pairing sketch follows this list)

  • When a taillight is disturbed by other illuminant sources or reflectors, vehicle detection based on taillight pairing fails
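
As a rough illustration of similarity-based pairing, the sketch below scores two bright spots on vertical alignment and size agreement; the features and equal weights are illustrative assumptions, not the paper's actual rules.

```python
def pair_similarity(spot_a, spot_b) -> float:
    """Score how plausibly two bright spots are the left/right
    taillights of one vehicle. Spots are (x, y, w, h) boxes; the
    features and equal weights are illustrative, not the paper's.
    """
    xa, ya, wa, ha = spot_a
    xb, yb, wb, hb = spot_b
    # Taillights of one vehicle sit on nearly the same image row.
    vertical = 1.0 - min(abs(ya - yb) / max(ha, hb), 1.0)
    # ...and their areas should be close in size.
    area_a, area_b = wa * ha, wb * hb
    size = min(area_a, area_b) / max(area_a, area_b)
    return 0.5 * vertical + 0.5 * size

# A fixed cutoff such as pair_similarity(a, b) > 0.8 is exactly the
# kind of rigid rule the highlight above criticizes.
```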

Summary

Introduction

Statistics in the EU, USA, and China indicate that rear-end collisions are a principal cause of highway accidents, and the risk of traffic accidents at night is greater than in the daytime [1,2,3]. At night, extraneous illuminant sources such as street lamps, traffic signals, and reflections coexist with vehicle lights, and these non-vehicle sources make it difficult to detect actual vehicles in nighttime road scenes. Existing vehicle detection methods do not consider the correlation of the same vehicle between the preceding and current frames, so their detection rates and robustness are poor. When a vehicle cannot be detected from its matching characteristics, the state predicted by Kalman filtering is taken as the vehicle position in the current frame. This tracking strategy can reduce the false negative rate of vehicle detection effectively, but it does little to reduce the false detection rate. To improve correctness and robustness, this paper presents a novel monocular-vision-based nighttime vehicle detection and tracking method that relies on taillight characteristics.
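
The fallback described here, taking the Kalman prediction as the vehicle position when matching fails, can be sketched with OpenCV's cv2.KalmanFilter. The constant-velocity state model and the noise covariances below are assumptions, not values from the paper.

```python
import cv2
import numpy as np

# Constant-velocity model: state (x, y, vx, vy), measurement (x, y).
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def track_step(kf, measurement):
    """One tracking step: predict, then correct only when the
    taillight was actually found in this frame. When detection
    fails, the prediction stands in for the vehicle position."""
    predicted = kf.predict()
    if measurement is not None:
        kf.correct(np.array(measurement, dtype=np.float32).reshape(2, 1))
    return float(predicted[0, 0]), float(predicted[1, 0])
```

In each frame, the taillight spot candidate would then be sought near the returned prediction, matching the search-region role the abstract assigns to Kalman filtering.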

Taillight Segmentation
Vehicle Detection Based on Taillight Pairing and Relevance Analysis
Taillight Pairing Based on Similarity Analysis
Removing the Non-Vehicle Taillight Pair Based on Relevance Analysis
Vehicle Tracking
Taillight Spot Extraction
Taillight Location Prediction Based on Kalman Filtering
Taillight Spot Extraction Based on Relevance Analysis
Vehicle Location Estimation Based on Extracted Taillights
Experimental Evaluation
Experimental Platform
Experimental Result
System Performance
Findings
Conclusions and Future Work