Abstract
With the introduction of concepts such as ubiquitous mapping, mapping-related technologies are increasingly being applied to autonomous driving and target recognition. Vision measurement and remote sensing still face several problems, including the difficulty of automatic vehicle discrimination, high miss rates when multiple vehicle targets are present, and sensitivity to the external environment. This paper proposes an improved RES-YOLO detection algorithm to address these problems and applies it to the automatic detection of vehicle targets. Specifically, the detection performance of the traditional YOLO algorithm is improved by selecting an optimized feature network and constructing an adaptive loss function. The BDD100K data set is used for training and validation, and the resulting optimized YOLO deep learning vehicle detection model is compared with recent advanced target recognition algorithms. Experimental results show that the proposed algorithm can effectively and automatically identify multiple vehicle targets and significantly reduces miss and false detection rates, with a local optimal accuracy of up to 95% and an average accuracy above 86% on large-volume data. The average accuracy of our algorithm exceeds that of all five comparison algorithms, including SSD and Faster-RCNN. In terms of average accuracy, RES-YOLO outperforms the original YOLO by 1.0% on the small-volume data set and by 1.7% on the large-volume data set, and training time is shortened by 7.3% compared with the original algorithm. The network is further tested on five types of locally measured vehicle data sets and shows satisfactory recognition accuracy under different interference backgrounds. In short, the proposed method can accomplish vehicle target detection under various environmental interferences.
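The abstract names two ingredients, a residual ("RES") feature network and an adaptive loss, without giving their exact form. The sketch below is a minimal PyTorch illustration of both ideas under stated assumptions: the layer sizes, the single-scale head, and the focal-style re-weighting are illustrative choices, not the paper's actual RES-YOLO architecture or loss.

```python
# Minimal sketch (PyTorch) of (1) a residual feature backbone feeding a
# YOLO-style detection head and (2) an "adaptive" objectness loss that
# re-weights hard examples. All names and hyperparameters here are
# assumptions for illustration, not the paper's exact RES-YOLO design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """3x3 conv block with an identity shortcut (ResNet-style)."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # shortcut eases gradient flow in deep nets


class TinyResYolo(nn.Module):
    """Toy residual backbone plus a single-scale YOLO-style head."""
    def __init__(self, num_classes: int = 5, num_anchors: int = 3):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.blocks = nn.Sequential(ResidualBlock(64), ResidualBlock(64))
        # Per anchor: 4 box offsets + 1 objectness score + class scores.
        self.head = nn.Conv2d(64, num_anchors * (5 + num_classes), 1)

    def forward(self, x):
        return self.head(self.blocks(self.stem(x)))


def adaptive_obj_loss(logits, targets, gamma: float = 2.0):
    """Focal-style objectness loss: down-weights easy negatives so the many
    empty grid cells do not swamp the few cells that contain vehicles.
    (The paper's actual adaptive loss is not specified in the abstract.)"""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p_t = torch.exp(-bce)  # probability the model assigns to the true label
    return ((1.0 - p_t) ** gamma * bce).mean()


if __name__ == "__main__":
    model = TinyResYolo()
    imgs = torch.randn(2, 3, 256, 256)          # dummy image batch
    preds = model(imgs)                         # (2, 3 * (5 + 5), 64, 64)
    obj_logits = preds[:, 4::10]                # objectness channel per anchor
    obj_targets = torch.zeros_like(obj_logits)  # dummy "no object" labels
    print(preds.shape, adaptive_obj_loss(obj_logits, obj_targets).item())
```

The intent of the re-weighting term `(1 - p_t) ** gamma` is simply to let confidently classified background cells contribute less to the loss, which is one common way an "adaptive" loss can reduce miss rates on crowded multi-vehicle scenes.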