Abstract
Lotus pods in unstructured environments often present multi-scale characteristics in captured images, which makes their automatic identification difficult and prone to missed and false detections. This study proposed a lightweight multi-scale lotus pod identification model, MLP-YOLOv5, to address this difficulty. The model adjusted the multi-scale detection layers and optimized the anchor box parameters to enhance small-object detection accuracy. The C3 module with a transformer encoder (C3-TR) and the shuffle attention (SA) mechanism were introduced to improve the model's feature extraction ability and detection quality. GSConv and VoVGSCSP modules were adopted to build a lightweight neck, thereby reducing model parameters and size. In addition, SIoU was used as the bounding-box regression loss to achieve better accuracy and faster convergence. Experimental results on the multi-scale lotus pod test set showed that MLP-YOLOv5 achieved a mAP of 94.9%, 3% higher than the baseline. In particular, the model's precision and recall for small-scale objects improved by 5.5% and 7.4%, respectively. Compared with other mainstream algorithms, MLP-YOLOv5 showed clear advantages in detection accuracy, parameter count, speed, and model size. The test results verified that MLP-YOLOv5 can quickly and accurately identify multi-scale lotus pod objects in complex environments, effectively supporting harvesting robots in the accurate, automatic picking of lotus pods.
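The SIoU regression loss mentioned above augments the plain IoU term with angle, distance, and shape penalties so that misaligned boxes converge faster. The following is a minimal plain-Python sketch of that formulation (following Gevorgyan's published definition of SIoU, not the authors' own code); the box format `(x1, y1, x2, y2)` and the `eps` guard are illustrative assumptions:

```python
import math

def siou_loss(box_pred, box_gt, theta=4.0, eps=1e-9):
    """Sketch of the SIoU loss for two boxes in (x1, y1, x2, y2) format.

    Assumed formulation (Gevorgyan 2022): loss = 1 - IoU + (distance + shape) / 2,
    where the distance term is modulated by an angle cost.
    """
    x1a, y1a, x2a, y2a = box_pred
    x1b, y1b, x2b, y2b = box_gt
    w1, h1 = x2a - x1a, y2a - y1a
    w2, h2 = x2b - x1b, y2b - y1b

    # Standard IoU term.
    iw = max(0.0, min(x2a, x2b) - max(x1a, x1b))
    ih = max(0.0, min(y2a, y2b) - max(y1a, y1b))
    inter = iw * ih
    union = w1 * h1 + w2 * h2 - inter + eps
    iou = inter / union

    # Smallest enclosing box (used to normalize the center distance).
    cw = max(x2a, x2b) - min(x1a, x1b)
    ch = max(y2a, y2b) - min(y1a, y1b)

    # Center offsets between the two boxes.
    s_cw = (x1b + x2b) / 2 - (x1a + x2a) / 2
    s_ch = (y1b + y2b) / 2 - (y1a + y2a) / 2
    sigma = math.sqrt(s_cw ** 2 + s_ch ** 2) + eps

    # Angle cost: drives the regression along the nearest axis first.
    sin_alpha = abs(s_ch) / sigma
    sin_beta = abs(s_cw) / sigma
    sin_angle = sin_beta if sin_alpha > math.sin(math.pi / 4) else sin_alpha
    angle_cost = math.cos(2 * math.asin(min(sin_angle, 1.0)) - math.pi / 2)

    # Distance cost, modulated by the angle cost via gamma.
    gamma = 2.0 - angle_cost
    rho_x = (s_cw / (cw + eps)) ** 2
    rho_y = (s_ch / (ch + eps)) ** 2
    dist_cost = 2.0 - math.exp(-gamma * rho_x) - math.exp(-gamma * rho_y)

    # Shape cost: penalizes width/height mismatch.
    omega_w = abs(w1 - w2) / (max(w1, w2) + eps)
    omega_h = abs(h1 - h2) / (max(h1, h2) + eps)
    shape_cost = (1 - math.exp(-omega_w)) ** theta + (1 - math.exp(-omega_h)) ** theta

    return 1.0 - iou + (dist_cost + shape_cost) / 2.0
```

A perfectly matched pair of boxes gives a loss near zero, and the loss grows with center offset and shape mismatch, which is what gives SIoU its faster convergence compared with a bare IoU loss.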