Abstract
There has been a global increase in the number of vehicles in use, resulting in a higher occurrence of traffic accidents. Advancements in computer vision and deep learning enable vehicles to independently perceive and navigate their environment, making decisions that enhance road safety and reduce traffic accidents. Accidents can be prevented, in both driver-operated and autonomous vehicles, by detecting living and inanimate objects in the environment such as vehicles, pedestrians, animals, and traffic signs, as well as by identifying lanes and obstacles. In our proposed system, road images are captured using a camera positioned behind the front windshield of the vehicle. Computer vision techniques are employed to detect straight or curved lanes in the captured images. The right and left lane boundaries of the vehicle's driving area are identified, and the drivable area is highlighted in a distinct color. To detect traffic signs, pedestrians, cars, and bicycles around the vehicle, we utilize the YOLOv5 model, which is based on convolutional neural networks. We use a combination of study-specific images and the GRAZ dataset in our research. In the object detection study, which involves 10 different object classes, we evaluate the performance of five versions of the YOLOv5 model. Our evaluation metrics include precision, recall, precision-recall curves, F1 score, and mean average precision. The experimental results demonstrate the effectiveness of our proposed lane detection and object detection methods.