In unstructured tea garden environments, accurate recognition and pose estimation of tea bud leaves are critical for autonomous harvesting robots. Because imaging distance varies, tea bud leaves appear at diverse scales and poses in camera views, which significantly complicates recognition and pose estimation. This study proposes a method that uses an RGB-D camera for precise recognition and pose estimation of tea bud leaves. The approach first constructs an instance segmentation model for tea bud leaves, followed by a dynamic weight estimation strategy to achieve adaptive pose estimation. Quantitative experiments show that the instance segmentation model achieves an mAP@50 of 92.0% for box detection and 91.9% for mask detection, improvements of 3.2% and 3.4%, respectively, over the YOLOv8s-seg instance segmentation model. The pose estimation results show a maximum angular error of 7.76°, a mean angular error of 3.41°, a median angular error of 3.69°, and a median absolute deviation of 1.42°; the corresponding distance errors are 8.60 mm, 2.83 mm, 2.57 mm, and 0.81 mm, further confirming the accuracy and robustness of the proposed method. These results indicate that the proposed method can be applied in unstructured tea garden environments for non-destructive, precise harvesting with autonomous tea bud-leaf harvesting robots.
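The angular-error statistics reported above (maximum, mean, median, and median absolute deviation) can be sketched as below. This is a minimal illustration, not the paper's evaluation code: it assumes poses are compared as 3-D direction vectors, and all function names are hypothetical.

```python
import math
import statistics

def angular_error_deg(v_est, v_true):
    # Angle in degrees between an estimated and a ground-truth 3-D direction vector
    # (hypothetical pose representation, assumed for illustration).
    dot = sum(a * b for a, b in zip(v_est, v_true))
    norm = (math.sqrt(sum(a * a for a in v_est))
            * math.sqrt(sum(b * b for b in v_true)))
    cos_theta = max(-1.0, min(1.0, dot / norm))  # clamp against float round-off
    return math.degrees(math.acos(cos_theta))

def summarize(errors):
    # Returns (max, mean, median, median absolute deviation) of a list of errors,
    # the four statistics quoted in the abstract.
    med = statistics.median(errors)
    mad = statistics.median([abs(e - med) for e in errors])
    return max(errors), statistics.fmean(errors), med, mad
```

For example, `angular_error_deg((1, 0, 0), (0, 1, 0))` evaluates to 90.0, and `summarize` applied to a list of per-sample errors yields the four summary statistics in one pass.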