Abstract

Autonomous vehicles necessitate the integration of advanced technologies such as computer vision and deep learning to comprehend and navigate their surroundings. A crucial yet challenging component of this integration is the accurate detection of lanes, which can be influenced by a multitude of varying lane characteristics and conditions. This research undertakes a comparative analysis of lane detection methodologies, specifically focusing on traditional image processing techniques and Convolutional Neural Networks (CNNs). The evaluation utilized a sample of 500 images from the CULane dataset, which encompasses a diverse range of traffic scenarios. Initially, a method incorporating Gaussian blurring, Canny edge detection, and Hough line transformation was examined. Despite its efficiency (operating at 30 frames per second), this approach exhibited a high error rate (an average Mean Squared Error (MSE) of 0.537), attributable to the loss of critical image details during the preprocessing stage. Subsequently, the performance of a fine-tuned YOLOv8 model, trained on a reformatted version of the CULane dataset, was assessed. The combination of object detection and subsequent Hough transformation yielded high accuracy, demonstrating the model's ability to learn and identify relevant lane features. Deep CNNs demonstrated superior performance over classical image processing techniques in terms of lane detection accuracy, thereby underscoring their potential applicability within the realm of autonomous vehicle technology.
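
For concreteness, the classical pipeline evaluated in the study can be sketched in a few lines of OpenCV. The smoothing kernel size, Canny thresholds, and Hough parameters below are illustrative assumptions; the abstract does not report the exact values used.

```python
# Minimal sketch of the classical lane-detection pipeline:
# Gaussian blur -> Canny edge detection -> probabilistic Hough transform.
# All parameter values here are illustrative, not those from the study.
import cv2
import numpy as np

def detect_lanes_classical(image_path: str) -> np.ndarray:
    frame = cv2.imread(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Smooth the image to suppress noise before edge detection.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Detect edges; thresholds chosen heuristically.
    edges = cv2.Canny(blurred, 50, 150)

    # Probabilistic Hough transform to extract candidate lane segments.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=50, minLineLength=40, maxLineGap=20)

    # Overlay detected segments on the original frame.
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
    return frame
```

As the abstract notes, this pipeline is fast but lossy: blurring and edge thresholding can discard lane detail before the Hough stage ever sees it, which is consistent with the high MSE reported.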

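The deep learning baseline can likewise be sketched with the ultralytics package. The dataset configuration file (culane_reformatted.yaml) and the training hyperparameters are hypothetical placeholders; the abstract states only that a YOLOv8 model was fine-tuned on a reformatted CULane dataset.

```python
# Sketch of fine-tuning YOLOv8 on a reformatted CULane dataset.
# The YAML path and hyperparameters are assumptions for illustration.
from ultralytics import YOLO

# Start from pretrained weights and fine-tune on lane annotations.
model = YOLO("yolov8n.pt")
model.train(data="culane_reformatted.yaml",  # hypothetical dataset config
            epochs=50, imgsz=640)

# Run inference on a frame; per the abstract, the detected lane regions
# are then refined with a subsequent Hough transformation.
results = model.predict("example_frame.jpg")
```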