Abstract
Road environmental perception is an essential prerequisite for the autonomous driving of intelligent vehicles, and road lane detection plays a crucial role in it. However, lane detection in complex road scenes remains challenging because of poor illumination, occlusion by other objects, and interference from unrelated road markings, which also hinders the commercial application of autonomous driving technology across diverse road scenes. To minimize the impact of illumination on lane detection, researchers have applied deep learning (DL) techniques to enhance low-light images. In this study, road lane detection is treated as an image segmentation problem and addressed with a DL approach to cope with the rapid environmental changes that occur during driving. First, the Zero-DCE++ approach is used to enhance road-scene video frames captured under low-light conditions. Then, building on the bilateral segmentation network (BiSeNet), an approach named associate self-attention with BiSeNet (ASA-BiSeNet), which integrates two attention mechanisms, is designed to improve lane detection. Finally, ASA-BiSeNet is trained on a self-made road lane dataset for the lane detection task and compared with the baseline BiSeNet approach. The experimental results show that ASA-BiSeNet runs at about 152.5 frames per second (FPS) with a mean intersection over union of 71.39%, which meets the requirements of real-time autonomous driving.
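The abstract describes attaching two attention mechanisms to a BiSeNet-style segmentation backbone for lane detection, but does not specify which attention variants or where they are inserted. The following PyTorch sketch is therefore only an illustrative assumption: it uses a placeholder encoder in place of BiSeNet's spatial and context paths, applies channel and spatial attention (hypothetical choices) to the fused features, and ends with a segmentation head, to show how such an attention-augmented pipeline could be wired up; class names such as `ASABiSeNetSketch` are invented for this example and are not the authors' implementation.

```python
# Hypothetical sketch of combining two attention mechanisms with a
# BiSeNet-style backbone for lane segmentation. The real ASA-BiSeNet details
# (attention types, insertion points, channel sizes) are not given in the
# abstract, so everything below is an illustrative assumption.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed variant)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))           # global average pool -> channel weights
        return x * w[:, :, None, None]            # reweight channels


class SpatialAttention(nn.Module):
    """Simple spatial attention over pooled channel statistics (assumed variant)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                              # reweight spatial positions


class ASABiSeNetSketch(nn.Module):
    """Toy stand-in: a small encoder in place of BiSeNet's spatial/context paths,
    followed by the two attention blocks and a lane-segmentation head."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1),
            nn.BatchNorm2d(128), nn.ReLU(inplace=True),
        )
        self.channel_att = ChannelAttention(128)
        self.spatial_att = SpatialAttention()
        self.head = nn.Conv2d(128, num_classes, 1)

    def forward(self, x):
        feat = self.encoder(x)
        feat = self.spatial_att(self.channel_att(feat))
        logits = self.head(feat)
        # Upsample back to the input resolution for per-pixel lane labels.
        return F.interpolate(logits, size=x.shape[2:], mode="bilinear",
                             align_corners=False)


if __name__ == "__main__":
    model = ASABiSeNetSketch()
    frame = torch.randn(1, 3, 360, 640)           # one (low-light-enhanced) video frame
    print(model(frame).shape)                     # -> torch.Size([1, 2, 360, 640])
```

In the described pipeline, the input frame would first pass through Zero-DCE++ for low-light enhancement before being fed to the segmentation network; that enhancement step is omitted here for brevity.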