Abstract
Accurate estimation of wheat ear number is of great significance for achieving high yield and ensuring food security. Many scholars have explored the use of computer vision for wheat ear counting to improve estimation efficiency and accuracy. However, existing single-method computer vision approaches lack robustness and recognise adherent wheat ears poorly. This study focused on the problem of adherent wheat ears and proposed a combined algorithm, "APW", which ensembles the alternating direction method of multipliers (ADMM), the Potts algorithm, and the watershed algorithm to achieve fast recognition and counting of wheat ears. Images of wheat ears were collected by unmanned aerial vehicle (UAV) under three different scenarios covering growth stage, planting density, and flight height, and the Potts model was used to recognise the images. The masked wheat-ear images were converted to greyscale and layered using a greyscale gradient. Finally, the watershed algorithm was used to segment adherent wheat ears in the layered image, and the centroids were labelled and counted. The results show that the best accuracy was obtained under the low-planting-density scenario, with an R² of 0.89 and a root mean square error (RMSE) of 3.72, suggesting that APW can handle scenarios with large background differences and low target adherence. These findings provide a new approach for efficient counting of wheat ears in UAV-acquired images.
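The final steps of the pipeline described above, splitting adherent (touching) objects with a distance transform and the watershed algorithm, then labelling and counting centroids, can be sketched as follows. This is a minimal illustration using scikit-image on a synthetic binary mask of two overlapping discs standing in for adherent wheat ears; the actual APW method uses ADMM/Potts segmentation and greyscale-gradient layering, which are not reproduced here.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

# Synthetic binary mask: two overlapping discs as a stand-in for
# adherent wheat ears (hypothetical data, not the paper's UAV imagery).
yy, xx = np.mgrid[0:80, 0:80]
mask = (((yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2)
        | ((yy - 45) ** 2 + (xx - 45) ** 2 < 15 ** 2))

# Distance transform: pixels deep inside each object score highest,
# producing one local maximum per object even when objects touch.
distance = ndi.distance_transform_edt(mask)

# One watershed marker per local maximum of the distance map.
coords = peak_local_max(distance, labels=mask, min_distance=10)
markers = np.zeros(mask.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

# Watershed on the inverted distance map splits the touching objects.
labels = watershed(-distance, markers, mask=mask)
n_objects = labels.max()

# Centroids of each segmented region, analogous to labelling and
# counting ear centroids in the paper.
centroids = ndi.center_of_mass(mask, labels, range(1, n_objects + 1))
print(n_objects, centroids)
```

In this sketch the distance transform replaces the greyscale-gradient layering as the height map fed to the watershed; both serve the same purpose of placing one basin per object so that adherent regions are separated rather than counted once.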