Abstract

Autonomous driving technology has recently been in the spotlight, but in the railway industry it is still in its infancy. Because trains run on fixed tracks, railways involve fewer control elements than autonomous cars; the drawback is that evasive maneuvers are impossible in a dangerous situation. In addition, a train cannot decelerate rapidly when braking because of the vehicle's weight and the need to protect passengers. For trams, one type of railway system, prior research has already addressed two base technologies for autonomous driving: generating profiles that plan braking and acceleration, and finding the location coordinates of surrounding objects through object recognition. In pilot research on automated tram driving, YOLOv3 was used for object detection to obtain object coordinates. YOLOv3 is an artificial intelligence model that finds the coordinates, sizes, and classes of objects in an image; it is the third version of YOLO, one of the best-known CNN-based object detection methods, characterized by moderate accuracy and fast speed. In this paper, we investigate whether an existing object detection model can deliver the detection performance required for autonomous trams. For the experiment, we used YOLOv4, the fourth version of YOLO.
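As a brief illustration of what "coordinates, sizes, and classes" means in a YOLO-style detector, the sketch below decodes detections given as normalized center coordinates and sizes into pixel-space bounding boxes. This is a minimal assumption-based example, not code from the paper; the function name, tuple layout, and confidence threshold are hypothetical.

```python
# Illustrative sketch only: decode YOLO-style detections
# (normalized center-x, center-y, width, height, confidence, class id)
# into pixel-space bounding boxes. The tuple layout and threshold
# are assumptions for illustration, not taken from the paper.

def decode_detections(detections, img_w, img_h, conf_threshold=0.5):
    """Convert normalized detections to (x1, y1, x2, y2, conf, cls) pixel boxes."""
    boxes = []
    for cx, cy, w, h, conf, cls in detections:
        if conf < conf_threshold:
            continue  # discard low-confidence detections
        # Shift from center/size to top-left and bottom-right corners,
        # then scale from the [0, 1] range to pixel coordinates.
        x1 = int((cx - w / 2) * img_w)
        y1 = int((cy - h / 2) * img_h)
        x2 = int((cx + w / 2) * img_w)
        y2 = int((cy + h / 2) * img_h)
        boxes.append((x1, y1, x2, y2, conf, cls))
    return boxes

# Example: one confident detection centered in a 640x480 frame,
# plus one low-confidence detection that gets filtered out.
dets = [(0.5, 0.5, 0.25, 0.5, 0.9, 0), (0.1, 0.1, 0.05, 0.05, 0.2, 1)]
print(decode_detections(dets, 640, 480))  # → [(240, 120, 400, 360, 0.9, 0)]
```

A real YOLOv3/YOLOv4 pipeline would additionally apply non-maximum suppression to merge overlapping boxes before using the coordinates.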
