Abstract

Estimating the phenology distribution of horticultural crops is important because it governs the timing of chemical thinning, which in turn determines fruit quality. This paper presents DeepPhenology, a novel CNN-based phenology distribution estimation method for apple flowers that uses RGB images and efficiently maps the flower distribution at the image, row, and block level. The image classification model VGG-16 was trained directly on relative phenology distributions calculated from manual counts of flowers in the field and the acquired imagery. The proposed method removes the need to annotate images, which overcomes the difficulty of distinguishing overlapping flower clusters or identifying hidden flower clusters in 2D imagery. DeepPhenology was tested on both daytime and night-time images captured with an RGB camera mounted on a ground vehicle, covering both Gala and Pink Lady varieties in an Australian orchard. An average Kullback-Leibler (KL) divergence of 0.23 over all validation sets and 0.27 over all test sets was achieved. Further evaluation compared the proposed model with YOLOv5 and showed that it outperforms this state-of-the-art object detection model on this task. By aggregating the relative phenology distributions of single images into row-level or block-level distributions, we can give farmers a precise, high-level overview of block performance as a basis for decisions on chemical thinning applications.
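The KL divergence reported above compares a predicted relative phenology distribution against a ground-truth distribution derived from manual counts, and image-level predictions are aggregated upward to row or block level. A minimal sketch of both steps is shown below; the five-stage split, the example values, and the unweighted averaging are illustrative assumptions, not details taken from the paper:

```python
import math

def kl_divergence(p, q, eps=1e-8):
    """KL(p || q) between two discrete distributions over phenology
    stages; eps avoids log(0) for empty stages."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def aggregate(dists):
    """Combine per-image distributions into one row/block-level
    distribution by averaging each stage and renormalizing.
    (Unweighted mean is an assumption for illustration.)"""
    n = len(dists)
    mean = [sum(d[i] for d in dists) / n for i in range(len(dists[0]))]
    total = sum(mean)
    return [m / total for m in mean]

# Hypothetical relative distributions over five phenology stages
truth = [0.05, 0.20, 0.50, 0.20, 0.05]
pred = [0.10, 0.25, 0.40, 0.20, 0.05]
print(kl_divergence(truth, pred))          # lower is better; 0 = identical
print(aggregate([truth, pred]))            # row-level summary of two images
```

Identical distributions give a KL divergence of zero, and larger values indicate a worse match, which is why the 0.23 and 0.27 averages above indicate close agreement with the field counts.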
