Abstract
Due to the rising cost and decreasing availability of labor, manual picking is becoming an increasing challenge for apple growers. A targeted shake-and-catch apple harvesting technique is being developed at Washington State University to address this challenge. The performance and productivity of such a harvesting technique can be increased greatly if the shaking process is automated. The first step toward automated shaking is the detection and localization of branches in apple tree canopies. In this work, a branch detection method was developed for apple trees trained in a formal, fruiting-wall architecture using depth features and a Region-based Convolutional Neural Network (R-CNN). A Microsoft Kinect v2 sensor was used to acquire RGB, pseudo-color, and depth images in a natural orchard environment. The R-CNN, built on an improved AlexNet, was trained to detect apple tree branches using integrated pseudo-color and depth images for improved detection accuracy. The average recall and accuracy of the Pseudo-Color Image and Depth (PCI-D) method were 92% and 86%, respectively, at an R-CNN confidence threshold of 50% on the pseudo-color image. For comparison, the Pseudo-Color Image (PCI) method (without depth images) achieved averages of only 86% and 81%, respectively. Furthermore, the average correlation coefficient (r) between the curves fitted to branch skeletons by the PCI-D method and the curves fitted to ground-truth images was 0.91—another indicator that the PCI-D method outperforms the PCI method. In addition, the average accuracy of branch detection increased for both the PCI and PCI-D methods as the sensor moved closer to the canopy. This study demonstrates the potential of using depth features in branch detection and skeleton estimation to develop effective shake-and-catch apple harvesting machines for formally trained apple orchards.
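The skeleton-evaluation metric described above compares a curve fitted to the detected branch skeleton against a curve fitted to the ground-truth skeleton via the Pearson correlation coefficient (r). The abstract does not specify the curve model or sampling scheme, so the sketch below is only an illustration of that idea, assuming a low-order polynomial fit and evaluation of both curves on a shared x-range; the function names are hypothetical.

```python
import numpy as np

def fit_skeleton(xs, ys, degree=3):
    """Fit a polynomial curve to branch-skeleton pixel coordinates.
    (The paper does not state the curve model; a cubic is assumed here.)"""
    return np.polynomial.polynomial.Polynomial.fit(xs, ys, degree)

def skeleton_correlation(detected_pts, truth_pts, degree=3, n_samples=100):
    """Pearson r between the detected and ground-truth fitted curves,
    sampled on the overlapping x-range of the two point sets."""
    fd = fit_skeleton(detected_pts[0], detected_pts[1], degree)
    ft = fit_skeleton(truth_pts[0], truth_pts[1], degree)
    lo = max(np.min(detected_pts[0]), np.min(truth_pts[0]))
    hi = min(np.max(detected_pts[0]), np.max(truth_pts[0]))
    x = np.linspace(lo, hi, n_samples)
    return float(np.corrcoef(fd(x), ft(x))[0, 1])
```

For example, a detected skeleton that is merely shifted by a constant pixel offset from the ground truth still yields r = 1, since Pearson correlation is invariant to shifts; r drops as the detected curve's shape diverges from the ground-truth shape.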