Abstract

Apple fruits on trees tend to swing because of wind or other natural causes, which reduces the accuracy of robotic apple picking. To improve the accuracy and speed of apple tracking and identification, a tracking and recognition method combined with an affine transformation was proposed. The method can be divided into three steps. First, the initial image was segmented by Otsu’s thresholding method based on the two times Red minus Green minus Blue (2R-G-B) color feature; after the binary image was refined, the apples were recognized with a local parameter-adaptive Hough circle transform, which improved recognition accuracy and avoided the long running time and excessive fitted circles of the traditional Hough circle transform. The process and results were verified experimentally. Second, the Shi-Tomasi corners detected and extracted from the first frame were tracked, and corners with large positive and negative (forward-backward) optical flow errors were removed. The affine transformation matrix between the two frames was then calculated with the Random Sample Consensus (RANSAC) algorithm to correct the scale of the template image and predict the apple positions. Third, the best positions of the target apples were searched within 1.2 times the predicted area using a de-meaned normalized cross-correlation template matching algorithm. Test results showed that without template correction the running time per frame was 25 ms and the tracking error exceeded 8%, while without apple position prediction the running time was 130 ms and the tracking error exceeded 20%. In comparison, the proposed algorithm ran in 25 ms per frame with a tracking error below 4%. The test results therefore indicate that tracking speed and accuracy can be greatly improved by the proposed method, and this strategy can also provide a reference for tracking and recognizing other oscillating fruits.

Keywords: apple picking robot, tracking and recognition algorithm, oscillating apple, Hough transform, pyramid LK optical flow algorithm, affine transform, template matching

DOI: 10.25165/j.ijabe.20201305.5520

Citation: Yang Q H, Chen C, Dai J Y, Xun Y, Bao G J. Tracking and recognition algorithm for a robot harvesting oscillating apples. Int J Agric & Biol Eng, 2020; 13(5): 163–170.
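To make the recognition step (step 1) concrete, the following Python/OpenCV sketch computes the 2R-G-B color feature, applies Otsu’s threshold, cleans up the binary image with morphological operations, and detects circles with a standard Hough circle transform. It is only a minimal illustration: the paper’s local parameter-adaptive Hough variant is not reproduced, and every numeric parameter below is an illustrative assumption rather than a value from the paper.

import cv2
import numpy as np

def detect_apples(bgr):
    # 2R-G-B color feature, clipped to the 0-255 range
    b, g, r = cv2.split(bgr.astype(np.int16))
    feature = np.clip(2 * r - g - b, 0, 255).astype(np.uint8)
    # Otsu's thresholding of the color feature
    _, mask = cv2.threshold(feature, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # "Improve" the binary image with simple morphological opening and closing
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # A standard Hough circle transform stands in for the paper's
    # local parameter-adaptive version; all parameters are assumptions
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1.5, minDist=40,
                               param1=100, param2=20, minRadius=15, maxRadius=120)
    return [] if circles is None else np.round(circles[0]).astype(int)  # (x, y, r) per apple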

Highlights

  • Manual picking for large-scale apple harvesting is time-consuming and labor-intensive, so apple-picking robots are being developed, and machine vision is one of their most important capabilities.

  • Existing template-matching trackers (Zhao et al.[23]; Lyu et al.[24]) associate the initial and later images, but the extracted templates cannot cope with changes of image scale and angle as the camera approaches the apple, leading to large tracking errors.

  • To track the position of the apple, the rectangular region of the target fruit is extracted from the 2R-G-B image as the template image (a sketch of this step and the corner tracking follows this list).

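To illustrate the template/corner extraction and the inter-frame tracking step, the sketch below detects Shi-Tomasi corners inside an assumed template rectangle of the first frame, tracks them into the next frame with pyramid Lucas-Kanade optical flow, discards corners whose forward-backward (positive/negative) flow error is large, and estimates the frame-to-frame affine matrix with RANSAC. The function name, the decision to restrict corners to the template region, and all thresholds are assumptions for illustration, not the paper’s implementation.

import cv2
import numpy as np

def estimate_frame_affine(prev_gray, curr_gray, roi, fb_thresh=1.0):
    # roi = (x, y, w, h): rectangular template region of the target fruit,
    # e.g. template = prev_gray[y:y + h, x:x + w]
    x, y, w, h = roi
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255
    # Shi-Tomasi corners from the first frame (restricted here to the template region)
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01,
                                   minDistance=7, mask=mask)
    lk = dict(winSize=(21, 21), maxLevel=3,
              criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    # Pyramid LK optical flow: forward (prev -> curr) and backward (curr -> prev)
    pts1, st1, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None, **lk)
    back, st2, _ = cv2.calcOpticalFlowPyrLK(curr_gray, prev_gray, pts1, None, **lk)
    fb_err = np.linalg.norm(pts0 - back, axis=2).ravel()
    good = (st1.ravel() == 1) & (st2.ravel() == 1) & (fb_err < fb_thresh)
    # Affine transform between the two frames, fitted with RANSAC; it can be used
    # to rescale the template and predict the new apple position
    M, _ = cv2.estimateAffine2D(pts0[good], pts1[good], method=cv2.RANSAC,
                                ransacReprojThreshold=3.0)
    return M  # 2x3 affine matrix, or None if estimation failed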


Introduction

Because manual picking for large-scale apple harvesting is time-consuming and labor-intensive, scientists are developing apple-picking robots, for which the machine’s vision capability is one of the most important aspects. In practice, one has to wait for the apples to become still, or image them repeatedly, in order to identify and locate the apples accurately, which significantly reduces picking efficiency. In view of these problems, Zhao et al.[23] and Lyu et al.[24] used a template matching method to track the apple position based on the information association between the initial and later images, but it was reported that the extracted templates could not cope with the changes of image scale and angle as the camera approached the apple, resulting in large tracking errors. In this study, template matching was therefore applied around a predicted area to track the position of the apple.
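As a rough illustration of this matching step, the sketch below searches for the template within a window 1.2 times the size of the predicted apple region, using OpenCV’s mean-subtracted normalized cross-correlation score (cv2.TM_CCOEFF_NORMED). The window construction and the function name are illustrative assumptions; the paper’s exact search strategy is not reproduced.

import cv2

def match_around_prediction(frame_gray, template, pred_box, scale=1.2):
    # pred_box = (x, y, w, h): apple region predicted via the affine transform
    x, y, w, h = pred_box
    cx, cy = x + w / 2.0, y + h / 2.0
    x0 = max(int(cx - scale * w / 2), 0)
    y0 = max(int(cy - scale * h / 2), 0)
    x1 = min(int(cx + scale * w / 2), frame_gray.shape[1])
    y1 = min(int(cy + scale * h / 2), frame_gray.shape[0])
    window = frame_gray[y0:y1, x0:x1]
    th, tw = template.shape[:2]
    if window.shape[0] < th or window.shape[1] < tw:
        return None  # prediction too close to the image border
    # Mean-subtracted normalized cross-correlation template matching
    res = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)
    bx, by = top_left
    return (x0 + bx, y0 + by, tw, th), score  # best match box and its correlation score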

Apple recognition
Apple target fast tracking
Experimental results and analysis
Conclusions