Abstract

Bunch shape and berry size are key indicators of the quality of table grapes and crucially affect their market value. Berry thinning is one of the most important tasks in grape cultivation for achieving an ideal bunch shape and leaving sufficient space for individual berries. Skilled grape farmers in Japan successfully guide the thinning process by the number of berries in a bunch; hence, a technique for automatically counting the berries in a working bunch has long been desired by farmers to improve the efficiency of the thinning task. This research presents a novel end-to-end berry-counting technique based on a deep neural network (DNN), and its contributions are as follows. First, because a DNN requires massive training data, a novel data augmentation technique that simulates the thinning process is proposed. Second, a new location-sensitive object detection model that integrates explicit location information and a supplementary classification loss into a state-of-the-art instance segmentation model is proposed to detect the berries in a working bunch with high accuracy. Third, a set of features, together with their extraction algorithms, is designed to predict the number of berries in a bunch (3D counting) from the berries detected in a single 2D image. Experiments on data collected during farmers' grape-thinning work were conducted to validate the accuracy and effectiveness of the proposed methods.
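
The abstract does not detail how the thinning process is simulated for data augmentation. Purely as an illustration, the Python sketch below shows one plausible way such augmentation could work, assuming each training image of a bunch comes with per-berry instance masks. The function name simulate_thinning, its parameters, and the flat background fill are hypothetical and are not taken from the paper.

    import random
    import numpy as np

    def simulate_thinning(image, berry_masks, keep_fraction, background_value=0):
        """Generate a synthetic 'partially thinned' training sample.

        image         : H x W x 3 uint8 array showing one grape bunch
        berry_masks   : list of H x W boolean arrays, one per annotated berry
        keep_fraction : fraction of berries to keep, e.g. 0.6
        Returns the edited image and the masks of the remaining berries.
        """
        n_keep = max(1, int(round(keep_fraction * len(berry_masks))))
        keep = set(random.sample(range(len(berry_masks)), n_keep))

        out = image.copy()
        for i, mask in enumerate(berry_masks):
            if i not in keep:
                # Crude removal: paint the dropped berry with a flat background
                # value; a real pipeline might inpaint or paste background
                # texture instead.
                out[mask] = background_value
        return out, [berry_masks[i] for i in keep]

    if __name__ == "__main__":
        # Toy call with dummy data, just to show the interface.
        img = np.zeros((480, 640, 3), dtype=np.uint8)
        masks = [np.zeros((480, 640), dtype=bool) for _ in range(30)]
        thinned_img, remaining = simulate_thinning(img, masks, keep_fraction=0.5)
        print(len(remaining))  # 15 berries kept

Under this assumed scheme, a sample generated with keep_fraction = 0.5 mimics a bunch midway through thinning, so intermediate berry counts are represented in the training set without additional manual annotation.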
