Abstract

In order to achieve accurate UAV-based wheat ear counting, a transfer learning method based on a ground-based fully convolutional network, i.e., EarDensityNet, was proposed in this study. EarDensityNet, which integrates a filter pyramid block and dilated convolution, was designed to map wheat canopy images to ear density maps generated from dot annotations. The wheat ear count can then be obtained by summing all the pixel values of the corresponding ear density map. Results showed strong correlations between the actual number of wheat ears and those estimated by EarDensityNet, with a high coefficient of determination (R2 = 0.9179) and a low Root-Mean-Square Error (RMSE = 17.61 ears, NRMSE = 4.47%), outperforming the compared methods. The ground resolution of canopy images had a significant impact on the performance of EarDensityNet. Transfer learning from the ground-based EarDensityNet could take full advantage of the rich details presented by ground-based images with high pixel resolution, thus effectively alleviating the degradation of counting performance caused by decreased ground resolution. The fine-tuned EarDensityNet therefore achieved more accurate UAV-based wheat ear counting (R2 = 0.9570, RMSE = 801.34, and NRMSE = 22.06%) than one trained from scratch, demonstrating its superiority and applicability. The border effect caused by splitting high-resolution digital images into sub-images did not pose a major problem for EarDensityNet, demonstrating great potential for generalization from plot-wise to field-wise counting. Wheat ear counting is recommended after the flowering stage, since the textures of wheat ears are then more distinct, allowing EarDensityNet to learn complex feature representations.
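The abstract does not give the authors' implementation details, but the density-map counting idea it describes can be illustrated with a minimal sketch: dot annotations are converted into a density map whose pixel values sum to the ear count, and the predicted count is recovered by summing the network's output map. The function names, Gaussian kernel width, and image size below are illustrative assumptions, not the paper's actual code.

```python
# Hypothetical sketch (not the authors' code): turning dot annotations into a
# target density map and recovering an ear count by summing pixel values,
# as described in the abstract.
import numpy as np
from scipy.ndimage import gaussian_filter


def dots_to_density_map(dot_coords, image_shape, sigma=4.0):
    """Place a unit impulse at each annotated ear centre, then smooth with a
    Gaussian kernel; the resulting map still integrates to the ear count."""
    density = np.zeros(image_shape, dtype=np.float32)
    for y, x in dot_coords:
        density[int(y), int(x)] += 1.0
    # Gaussian smoothing spreads each impulse but preserves its total mass.
    return gaussian_filter(density, sigma=sigma)


def count_from_density_map(density_map):
    """Ear count = sum of all pixel values of the (predicted) density map."""
    return float(density_map.sum())


# Example: three annotated ears in a 256 x 256 sub-image (assumed size).
dots = [(40, 60), (120, 200), (180, 90)]
target = dots_to_density_map(dots, (256, 256))
print(count_from_density_map(target))  # approximately 3.0
```

In practice, a fully convolutional regressor such as EarDensityNet would be trained to predict maps like `target` from canopy images, and plot-level counts would be obtained by summing the predicted maps of all sub-images.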
