Abstract

Accurate positioning of fruit is a key problem that has attracted considerable attention in the field of harvesting robots. The complex environment and the close proximity of fruits make the perception of dense crops in greenhouses challenging. Unlike solutions that rely on special equipment or other auxiliary information, we propose a novel positioning approach based on instance segmentation with a monocular RGB camera. To achieve high positioning accuracy, we first design a deep convolutional neural network (CNN) in a multi-task framework that outputs a binary segmentation map and an embedded feature map. To counter the degradation in intersection-over-union (IoU) on the binary segmentation task caused by multi-task optimisation, the encoder of our network is redesigned on the basis of the Visual Geometry Group network with 16 convolutional layers (VGG-16). Mean-shift clustering is then applied to the embedded feature map to obtain instance segmentation. Finally, a contour-finding algorithm outlines each fruit without the help of any contextual information, and the five fruits with the largest contour areas are selected as positioning targets. We evaluate our method on a public sweet pepper dataset and achieve competitive results. Normalised by the fruit radius, the average position error for the first target in the harvesting order is 0.18, showing that our method outperforms the semantic segmentation method. Averaged over the first five targets in the harvesting order, this index is below 0.3, comparable to that of the semantic segmentation method, which outputs only a single target.
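
As a rough illustration of the post-processing stage the abstract describes, the sketch below clusters per-pixel embeddings into instances with mean-shift, outlines each instance, and keeps the five largest contours. It is a minimal sketch, not the paper's implementation: the function name `position_fruits`, the array shapes, the `bandwidth` value, and the use of scikit-learn's MeanShift and OpenCV's generic contour routines are all assumptions; the paper's own contour-finding algorithm is not reproduced here.

```python
import numpy as np
import cv2
from sklearn.cluster import MeanShift

def position_fruits(binary_mask, embeddings, bandwidth=0.5, top_k=5):
    """Cluster foreground embeddings into instances, then return the
    centroids of the top_k instances with the largest contour areas.

    binary_mask: (H, W) uint8 mask from the binary segmentation head.
    embeddings:  (H, W, D) float array from the embedding head.
    """
    ys, xs = np.nonzero(binary_mask)
    feats = embeddings[ys, xs]                       # (N, D) foreground embeddings
    labels = MeanShift(bandwidth=bandwidth).fit_predict(feats)

    scored = []
    for inst in np.unique(labels):
        # Rasterise this instance back into a mask for contour extraction.
        inst_mask = np.zeros_like(binary_mask)
        inst_mask[ys[labels == inst], xs[labels == inst]] = 255
        contours, _ = cv2.findContours(inst_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        c = max(contours, key=cv2.contourArea)       # outline of this instance
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
            scored.append((cv2.contourArea(c), centroid))

    # Keep the top_k fruits with the largest contour areas as targets.
    scored.sort(key=lambda t: t[0], reverse=True)
    return [xy for _, xy in scored[:top_k]]
```

Note that mean-shift over every foreground pixel is slow at full resolution; subsampling the foreground pixels before clustering is a common workaround, though the abstract does not say how the authors handle this.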
