Abstract
Real-time, high-precision detection on embedded platforms is critical for harvesting robots to accurately locate table grapes. A novel detection method, ESP-YOLO, is proposed for table grapes in trellis-structured orchards to improve detection accuracy and efficiency. It builds on You Only Look Once (YOLO) with an Efficient Layer Shuffle Aggregation Network (ELSAN), Squeeze-and-Excitation (SE) attention, Partial Convolution (PConv) and Soft Non-Maximum Suppression (Soft-NMS). In the backbone, a channel shuffle operation enabling cross-group information interchange replaces the CSPDarkNet53 C3 transition layers for grape feature extraction. In the neck, PConv extracts features from only a subset of channels, accelerating inference while preserving spatial features. SE blocks are inserted in the backbone to reweight channels for channel-wise features of grape images, and a modified Soft-NMS improves the separation of densely clustered grapes. The algorithm is evaluated on embedded platforms in complex scenarios, including overlapping, adhering grape clusters and occlusion by stems and leaves. The ELSAN block boosts inference speed by 46% while maintaining accuracy, and the mAP@0.5:0.95 of ESP-YOLO surpasses that of other advanced methods by 3.7%–16.8%. ESP-YOLO can thus serve as a useful tool for harvesting robots to detect table grapes accurately and quickly in complex scenarios.
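The cross-group information interchange underlying ELSAN relies on a channel shuffle operation. A minimal NumPy sketch of the standard ShuffleNet-style shuffle is shown below; this illustrates the general technique, not the authors' exact implementation, and the function name and tensor layout are illustrative assumptions:

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Interleave channels across groups so information flows between them.

    x: feature map of shape (N, C, H, W); C must be divisible by `groups`.
    (Illustrative sketch of the generic channel-shuffle operation.)
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # (N, C, H, W) -> (N, groups, C//groups, H, W)
    x = x.reshape(n, groups, c // groups, h, w)
    # swap the group axis with the within-group axis, then flatten back
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

# With 4 channels in 2 groups, channel order [0, 1, 2, 3] becomes [0, 2, 1, 3],
# so each group in the next grouped convolution sees channels from both groups.
x = np.arange(4, dtype=float).reshape(1, 4, 1, 1)
print(channel_shuffle(x, 2).flatten().tolist())  # → [0.0, 2.0, 1.0, 3.0]
```

After the shuffle, each group in the following grouped or partial convolution receives channels originating from every group, which is what allows the cheaper grouped layers to retain cross-channel expressiveness.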