Abstract

To improve the ability of intelligent vehicles to accurately recognize the semantics of the vehicle ahead, a new method of lamp signal recognition is proposed that combines deep learning with traditional computer vision. First, the method uses the YOLOv4 (You Only Look Once) network to detect vehicles and obtain accurate vehicle tail regions. Then, based on the spatial distribution characteristics of lit lamp pixels in hue, saturation, value (HSV) space, an HSV segmentation method with a region-adaptive threshold is proposed to improve pixel extraction quality. Finally, a deep neural network model is trained on the collected sample data to classify brake lamp, turn signal, and lamp-off states from the lit-pixel information and to infer the current meaning of the lamps of the vehicle ahead. Python 3.8, PyTorch 1.9, and OpenCV 3.2 are used as implementation tools, and the method is tested on daytime urban traffic scenes. The experimental results show that the average accuracy of the algorithm is 81.3%, with 75.8% accuracy for the left turn signal and 76.4% for the right turn signal.

Article Highlights

- This paper investigates vehicle lamp signal recognition, a task rarely explored in intelligent vehicle perception. A convolutional neural network is trained on an autonomous driving dataset to detect vehicle tail regions.
- The image color space is transformed, and a region-based adaptive threshold is proposed for semantic segmentation of vehicle taillights with different hue, saturation, and value.
- A deep neural network is constructed for lamp signal classification. Combined with data collected in real traffic scenes and an optimized network structure, the classification of turn signals and brake signals is completed.
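The region-adaptive HSV thresholding step described above can be sketched as follows. This is an illustrative approximation, not the paper's exact formulation: it assumes the lit-pixel criterion combines a red-hue gate (typical of tail lamps, using OpenCV's 0–179 hue range) with a Value threshold computed from the statistics of the detected tail region itself, so the threshold adapts to each region's overall brightness. The `v_offset` parameter and the specific hue/saturation bounds are hypothetical.

```python
import numpy as np

def lit_pixel_mask(hsv_region, v_offset=40.0):
    """Illustrative region-adaptive segmentation of lit lamp pixels.

    hsv_region: (H, W, 3) array of H, S, V channels for one detected
    vehicle tail region (OpenCV convention: hue in [0, 179]).
    Returns a boolean mask of pixels considered 'lit'.
    """
    h = hsv_region[..., 0]
    s = hsv_region[..., 1]
    v = hsv_region[..., 2]

    # Adaptive part: the Value threshold depends on this region's own
    # mean brightness, rather than a single global constant.
    v_thresh = v.mean() + v_offset

    # Red hue wraps around 0 in OpenCV's hue scale.
    red_hue = (h <= 10) | (h >= 170)

    return red_hue & (s > 60) & (v > v_thresh)
```

On a synthetic region whose background Value is low, a bright saturated red pixel passes the mask while dim or unsaturated pixels are rejected, which is the behavior the region-adaptive threshold is meant to provide across tail regions of differing exposure.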
