Abstract

Accurate identification of strawberries at different growth stages, together with determination of optimal picking points, is a key problem for strawberry-picking robots in agricultural automation. To address this problem, this paper proposes a fast method for detecting strawberry ripeness and picking points based on an improved YOLO v8-Pose (You Only Look Once) model and an RGB-D depth camera. After comparing the YOLO v5-Pose, YOLO v7-Pose, and YOLO v8-Pose models, YOLO v8-Pose is selected as the base model for strawberry ripeness and picking-point detection. To further improve detection accuracy, this paper makes targeted improvements: all Concat modules in the Neck are replaced with BiFPN modules for richer feature fusion, enhancing the model's global feature extraction capability; the backbone network is restructured with the MobileViTv3 framework, strengthening the model's contextual feature extraction; and the CIoU loss function at the output side is replaced with the SIoU loss function, accelerating model convergence. The improved YOLO v8-Pose achieves a mAP-kp of 97.85%, a 5.49% improvement over the original model. To accurately localize the three-dimensional position of strawberry picking points, the detected picking points are projected onto the corresponding depth data to obtain their three-dimensional coordinates. Experimental results show that the mean absolute error and mean absolute percentage error of picking-point localization are 0.63 cm and 1.16%, respectively. This study thus presents a method that simultaneously detects strawberry maturity, identifies the harvesting location, and accurately localizes the picking point.
This work has considerable theoretical and practical significance for improving the intelligence of strawberry-harvesting robots and advancing automation and smart capabilities in agricultural production.
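The projection step mentioned above, in which a detected 2D picking point is combined with the aligned depth value to recover a 3D position, typically follows the standard pinhole camera model. The sketch below illustrates this idea; the function name and the camera intrinsics (fx, fy, cx, cy) are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of deprojecting a picking-point pixel (u, v) with an
# aligned depth value (in metres) into camera-frame 3D coordinates,
# using the pinhole camera model. All intrinsics below are assumed
# example values, not the paper's calibration.

def deproject_picking_point(u, v, depth_m, fx, fy, cx, cy):
    """Map pixel (u, v) with depth (metres) to camera-frame (X, Y, Z)."""
    x = (u - cx) * depth_m / fx  # horizontal offset scaled by depth
    y = (v - cy) * depth_m / fy  # vertical offset scaled by depth
    z = depth_m                  # depth axis is taken directly
    return (x, y, z)

# Example with made-up intrinsics for a 640x480 depth image:
# a point at the optical centre, 0.5 m away, maps to (0, 0, 0.5).
point_3d = deproject_picking_point(
    u=320, v=240, depth_m=0.50,
    fx=600.0, fy=600.0, cx=320.0, cy=240.0,
)
```

In practice, RGB-D SDKs such as librealsense provide equivalent deprojection routines that also account for lens distortion.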

