Abstract

Due to the constantly increasing shortage of labour in the agricultural sector, automation is on the rise. This trend is particularly evident in field vegetable production, where the majority of costs are associated with human labour. Automating these labour-intensive tasks requires intelligent camera systems. Typical state-of-the-art learning algorithms require large amounts of labelled data for reliable detection and segmentation, and the recording and labelling effort is further increased by the diverse field conditions that need to be covered in the datasets. In this paper, an unsupervised approach is presented that unifies varying environmental conditions in the field, together with an analysis of how this unpaired, deep-learning-based unification transfers to unknown exposure scenarios, different growth stages and other varieties. For this purpose, a Cycle-Consistent Adversarial Network (CycleGAN) is used, which adapts varying exposure and background situations to the variance of the training data without additional labelling effort. By adapting the exposure with a CycleGAN, the required labelling effort could be reduced by up to a factor of five relative to a heterogeneous dataset, as fewer variations need to be covered for a robust segmentation. The transfer of this approach to other growth stages, as well as to another cabbage cultivar, shows the potential for further data generalisation.
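
To illustrate the unification step described above, the following minimal sketch (not the authors' implementation) shows how a trained CycleGAN generator might be applied at inference time to translate field images from a varying-exposure domain into the canonical exposure domain of the labelled training data, before they are passed to a segmentation network. The class names, network size and checkpoint path are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Residual block as commonly used in CycleGAN-style generators."""
    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x):
        return x + self.block(x)

class ExposureGenerator(nn.Module):
    """Simplified generator G: varied exposure -> canonical exposure (illustrative)."""
    def __init__(self, channels: int = 64, n_blocks: int = 6):
        super().__init__()
        self.model = nn.Sequential(
            nn.ReflectionPad2d(3),
            nn.Conv2d(3, channels, kernel_size=7),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            *[ResidualBlock(channels) for _ in range(n_blocks)],
            nn.ReflectionPad2d(3),
            nn.Conv2d(channels, 3, kernel_size=7),
            nn.Tanh(),
        )

    def forward(self, x):
        return self.model(x)

# Usage: unify the exposure of a batch of field images before segmentation.
generator = ExposureGenerator()
# generator.load_state_dict(torch.load("cyclegan_exposure_G.pth"))  # hypothetical checkpoint
generator.eval()
with torch.no_grad():
    field_batch = torch.rand(4, 3, 256, 256) * 2 - 1   # placeholder images scaled to [-1, 1]
    unified_batch = generator(field_batch)              # exposure-unified images
```

Because the generator is trained on unpaired images of the two exposure domains, no additional labelling is needed; only the labelled data of the canonical domain is used to train the downstream segmentation model.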
