Abstract

Tea shoot detection and localization are highly challenging tasks because of varying illumination, inevitable occlusion, tiny targets, and dense growth. To achieve automatic plucking of tea shoots in a tea garden, a reliable algorithm based on red, green, blue-depth (RGB-D) camera images was developed to detect and locate tea shoots in the field for tea-harvesting robots. In this study, labeling criteria were first established for images collected across multiple periods and varieties in the tea garden. A "you only look once" (YOLO) network was then used to detect tea shoot (one bud with one leaf) regions in the RGB images captured by an RGB-D camera, achieving a detection precision of 93.1% and a recall of 89.3%. To achieve three-dimensional (3D) localization of the plucking position, 3D point clouds of the detected target regions were acquired by fusing the depth and RGB images captured by the RGB-D camera. Noise was then removed by point cloud pre-processing, and the point cloud of the tea shoots was obtained using Euclidean clustering and a target point cloud extraction algorithm. Finally, the 3D plucking position of the tea shoots was determined by combining the tea growth characteristics, point cloud features, and the sleeve plucking scheme, which solves the problem that the plucking point may be invisible in the field. To verify the effectiveness of the proposed algorithm, tea shoot localization and plucking experiments were conducted in the tea garden. The plucking success rate for tea shoots was 83.18%, and the average localization time per target was approximately 24 ms. These results demonstrate that the proposed method can be used for robotic tea plucking.
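The localization stage described above, fusing a depth image with the RGB image to obtain a point cloud of the detected region and then isolating the target by Euclidean clustering, can be sketched in a simplified form. The abstract does not give implementation details, so the following is only an illustrative sketch: it assumes a pinhole camera model with known intrinsics (`fx`, `fy`, `cx`, `cy` are hypothetical parameters), depth in meters, and a PCL-style Euclidean cluster extraction implemented with a KD-tree; the paper's actual pre-processing and extraction algorithms are not reproduced here.

```python
import numpy as np
from scipy.spatial import cKDTree

def deproject(depth, fx, fy, cx, cy):
    """Convert a depth image (meters) to an N x 3 point cloud using the
    pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    v, u = np.nonzero(depth > 0)          # keep only pixels with valid depth
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.column_stack([x, y, z])

def euclidean_cluster(points, radius=0.01, min_size=10):
    """Group points whose neighbors lie within `radius` meters (region
    growing over a KD-tree), mimicking Euclidean cluster extraction.
    Returns a label per point; clusters smaller than `min_size` get -2."""
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)   # -1 = unvisited
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        frontier, members = [seed], [seed]
        labels[seed] = current
        while frontier:
            idx = frontier.pop()
            for nb in tree.query_ball_point(points[idx], radius):
                if labels[nb] == -1:
                    labels[nb] = current
                    frontier.append(nb)
                    members.append(nb)
        if len(members) < min_size:
            labels[np.array(members)] = -2      # discard small clusters as noise
        else:
            current += 1
    return labels
```

In a full pipeline, `deproject` would be applied only to the depth pixels inside each YOLO bounding box, and the largest (or nearest) cluster returned by `euclidean_cluster` would be taken as the tea shoot before estimating the plucking position from its point cloud features.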
