Abstract
Recent deep learning methods have enabled important progress in the automatic detection of wheat ears in the field. Nevertheless, a method able to both count and segment the ears, validated across all development stages from heading to maturity, was still lacking. Moreover, the critical step of converting an ear count in an image into an ear density, i.e. a number of ears per square metre in the field, has been widely ignored in most previous studies. For this research, wheat RGB images were acquired from heading to maturity in two field trials with contrasting fertilisation scenarios. An unsupervised learning approach based on the YOLOv5 model, combined with the cutting-edge DeepMAC segmentation method, was used to develop a wheat ear counting and segmentation pipeline that required only a limited amount of labelling work for training. An additional label set covering all development stages was built for validation. The average F1 score was 0.93 for ear bounding box detection and 0.86 for segmentation. To convert the ear counts into ear densities, a second RGB camera was used so that the distance between the cameras and the ears could be measured by stereovision. That distance was used to compute the image footprint at ear level, and the number of ears was divided by this footprint to obtain the ear density. The obtained ear densities were consistent with the fertilisation scenarios but, for a given fertilisation level, differences were observed between acquisition dates. This highlights that the measurement was not able to retrieve absolute ear densities for all development stages and conditions. The deep learning measurement considered the most reliable nonetheless outperformed observations from three human operators.
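As an illustration of the counting step, the sketch below loads a YOLOv5 detection model through the public `ultralytics/yolov5` torch.hub interface and counts the ear bounding boxes in a single image. The weights path, image path, and confidence threshold are illustrative assumptions; the abstract does not specify the authors' inference settings or provide their trained weights.

```python
import torch

# Load YOLOv5 with custom weights (hypothetical path: the paper's
# trained ear-detection weights are not provided here).
model = torch.hub.load("ultralytics/yolov5", "custom", path="wheat_ears.pt")
model.conf = 0.25  # assumed confidence threshold, not from the paper

# Run inference on one RGB field image (hypothetical filename).
results = model("plot_image.jpg")

# Each row of results.xyxy[0] is one detected ear bounding box.
n_ears = len(results.xyxy[0])
print(f"Ears detected in image: {n_ears}")
```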
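The conversion from an ear count to an ear density can be made concrete with a pinhole camera model: the stereo-derived camera-to-ear distance scales the sensor dimensions into a ground footprint at ear level, and the count is divided by that footprint area. This is a minimal sketch of that geometry; the sensor size, focal length, distance, and ear count below are assumed values for illustration, not figures from the study.

```python
def ear_density(n_ears: int,
                distance_m: float,
                focal_length_mm: float,
                sensor_width_mm: float,
                sensor_height_mm: float) -> float:
    """Ears per square metre from one image, pinhole camera model.

    The image footprint at ear level scales linearly with the
    camera-to-ear distance measured by stereovision.
    """
    footprint_w = distance_m * sensor_width_mm / focal_length_mm   # metres
    footprint_h = distance_m * sensor_height_mm / focal_length_mm  # metres
    return n_ears / (footprint_w * footprint_h)

# Example with assumed values (not from the paper): 600 ears counted,
# camera 1.0 m above the ear layer, 16 mm lens, 23.5 x 15.6 mm sensor.
print(f"{ear_density(600, 1.0, 16.0, 23.5, 15.6):.1f} ears/m^2")
```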