Abstract

Lane detection is a challenging task owing to problems such as the diversity of lane markings, occlusion, and dazzling light. We believe two factors help address these problems and thereby improve detection performance: global context dependency and effective feature representation that focuses on important feature channels. In this work, we formulate lane detection as instance segmentation and develop a novel dual attention network, DALaneNet, for real-time lane detection. The network leverages spatial attention and channel attention mechanisms to achieve stronger feature representational power: the spatial attention learns positional relations between pixels, while the channel attention learns the importance of individual feature channels. This dual attention mechanism strengthens the lane-line feature representation and effectively handles occlusion and lighting issues. Furthermore, to address the low running speed that segmentation-based networks usually suffer from due to complex convolution computation, we propose more efficient convolution structures to improve computational efficiency. Extensive experiments were conducted on the TuSimple lane marking dataset. The results demonstrate that DALaneNet is robust under various illumination conditions and especially effective in the presence of occlusion. Compared with existing state-of-the-art methods, DALaneNet offers combined advantages in detection accuracy and speed.
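The abstract does not include implementation details, so the following is only a minimal PyTorch sketch of a generic dual attention block: a position (spatial) attention module that models pairwise pixel relations, followed by a squeeze-and-excitation-style channel attention module that reweights feature channels. The module names, channel and reduction parameters, and the order in which the two attentions are composed are illustrative assumptions and are not taken from the DALaneNet paper.

```python
import torch
import torch.nn as nn

class PositionAttention(nn.Module):
    """Self-attention over spatial positions: every pixel aggregates features
    from all other pixels, weighted by their pairwise affinity.
    (Generic sketch; not the authors' DALaneNet module.)"""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, 1)
        self.key = nn.Conv2d(channels, channels // reduction, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # B x N x C'
        k = self.key(x).flatten(2)                     # B x C' x N
        v = self.value(x).flatten(2)                   # B x C  x N
        attn = torch.softmax(q @ k, dim=-1)            # B x N x N pixel affinities
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x

class ChannelAttention(nn.Module):
    """Channel attention: gates each channel by its globally pooled importance."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)  # per-channel weights in [0, 1]

class DualAttentionBlock(nn.Module):
    """Composes spatial and channel attention (order here is an assumption)."""
    def __init__(self, channels):
        super().__init__()
        self.pos = PositionAttention(channels)
        self.chn = ChannelAttention(channels)

    def forward(self, x):
        return self.chn(self.pos(x))

# Example: refine a feature map from a segmentation backbone (shapes are illustrative).
feats = torch.randn(2, 64, 36, 100)        # B x C x H x W
refined = DualAttentionBlock(64)(feats)
print(refined.shape)                        # torch.Size([2, 64, 36, 100])
```

The attended feature map keeps the input shape, so such a block can be inserted between backbone and segmentation head without changing the surrounding architecture.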
