Abstract

Strawberry detection and ripeness classification are important concerns in robotic harvesting, as a precondition for efficient and nondestructive picking. However, the varied sizes and mutual overlap of strawberries complicate detection, and research on strawberry ripeness classification remains limited. Therefore, this study proposes the red color ratio, the proportion of a strawberry's surface that appears red, as a new parameter for quantifying strawberry ripeness. First, an adaptive strawberry feature augmentation network (ASFA-net) is proposed to generate instance masks for the strawberries. ASFA-net uses the shifted window (Swin) Transformer as the backbone for strawberry feature extraction and employs a feature pyramid network together with the proposed strawberry feature adaptive fusion module to augment the features. The proposed decoupled head network is then applied to generate the final results. Second, the red region within the mask of each strawberry is segmented in the hue, saturation, and value (HSV) color space to calculate the proportion of red area for that individual strawberry. ASFA-net was verified on a self-built strawberry dataset. The results show that ASFA-net detects strawberries accurately and efficiently, with a mean average precision of 95.91 ± 0.64 % and a mean intersection over union of 90.15 ± 1.49 %. The strawberry ripeness classification method also performed well, with an accuracy of 95.94 ± 1.40 %, a false positive rate of 3.09 ± 0.46 %, and a false negative rate of 4.22 ± 1.68 %. The purpose of this study is to establish a method for simultaneous strawberry detection and ripeness classification in a facility (protected cultivation) environment, providing a new reference for the vision systems of harvesting robots.
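To illustrate the red-color-ratio idea described above, the following is a minimal sketch of computing the proportion of red pixels inside one strawberry's instance mask using an OpenCV/NumPy pipeline. The HSV thresholds and the ripeness cutoff shown here are illustrative assumptions, not the values used in the paper.

    # Sketch: red color ratio of a single strawberry instance.
    # Assumed: OpenCV + NumPy; thresholds below are illustrative only.
    import cv2
    import numpy as np

    def red_color_ratio(image_bgr: np.ndarray, instance_mask: np.ndarray) -> float:
        """Return the fraction of red pixels within one strawberry's mask."""
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)

        # Red wraps around hue 0 in OpenCV's 0-179 hue scale, so two bands are combined.
        lower_red = cv2.inRange(hsv, (0, 70, 50), (10, 255, 255))
        upper_red = cv2.inRange(hsv, (170, 70, 50), (179, 255, 255))
        red_pixels = cv2.bitwise_or(lower_red, upper_red) > 0

        strawberry_pixels = instance_mask > 0
        strawberry_area = int(strawberry_pixels.sum())
        if strawberry_area == 0:
            return 0.0
        red_area = int(np.logical_and(red_pixels, strawberry_pixels).sum())
        return red_area / strawberry_area

    # Example use: ripeness could then be classified by thresholding the ratio
    # (the 0.8 cutoff here is an assumed value for illustration).
    # ratio = red_color_ratio(frame, mask)
    # is_ripe = ratio >= 0.8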
