Abstract

Numerous object detection algorithms, such as Faster R-CNN, YOLO, and SSD, have been applied extensively across many fields, and both their accuracy and speed have improved significantly. However, as 6G technology develops, detection performance on small objects in intelligent autonomous transportation remains unsatisfactory. To strengthen multi-scale detection ability, especially for small objects, this study proposes an object detection model based on a multi-attention residual network (MA-ResNet). First, a residual network incorporating spatial attention, channel attention, and self-attention was designed as MA-ResNet, and the dataset labels were smoothed. The proposed MA-ResNet then replaced VGG-16, the original feature extractor of Faster R-CNN, and features from different layers of MA-ResNet were used to construct a feature pyramid, yielding an improved Faster R-CNN object detection model based on MA-ResNet. The effectiveness of the model was then verified. The results demonstrate that MA-ResNet outperforms other feature extraction models, with faster convergence, higher accuracy, and stronger classification accuracy on small objects. The improved Faster R-CNN model effectively improves retrieval accuracy, performance, and robustness, and adapts well to targets of different scales in scenarios such as vehicle identification and autonomous driving in a 6G-enabled intelligent autonomous transport system. The study thus provides a reference for improving multi-scale, and especially small-object, detection in intelligent transportation with 6G.
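The abstract mentions that the dataset labels were smoothed before training. A minimal sketch of standard label smoothing is shown below; the smoothing factor `epsilon = 0.1` and the uniform-redistribution variant are assumptions for illustration, not values taken from the paper.

```python
def smooth_labels(one_hot, epsilon=0.1):
    """Soften a one-hot target vector to reduce classifier overconfidence.

    Each component becomes (1 - epsilon) * y + epsilon / K, where K is the
    number of classes, so the true class keeps most of the mass while a
    small uniform share is redistributed over all classes.
    """
    k = len(one_hot)
    return [(1 - epsilon) * y + epsilon / k for y in one_hot]


# Example: 4-class problem, true class at index 2.
smoothed = smooth_labels([0, 0, 1, 0], epsilon=0.1)
# True class: 0.9 * 1 + 0.1 / 4 = 0.925; others: 0.1 / 4 = 0.025.
```

The smoothed vector still sums to 1, so it remains a valid probability distribution for a cross-entropy loss.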

