Abstract

Early-stage rape seedling density is closely related to yield estimation, growth diagnosis, cultivated-area statistics, and field management. At present, rape seedling density is estimated mainly by manual sampling and counting, which are inefficient and inaccurate. As digital images have become more commonplace, computer vision techniques have emerged as a promising way to automate this task. Farmland environments, however, pose many challenges, including scale variation, denseness, and background occlusion. In this study, an improved multi-column convolutional neural network, called the seedling rape density prediction network (SRDPNet), is proposed to address accurate density estimation and counting of rape seedlings in complex farmland scenarios. Building on a multi-column convolutional attention encoder, filters of different sizes capture the basic features of rape seedlings at various scales. Channel attention and position attention modules are introduced into the branches to mitigate the loss of counting accuracy caused by background errors and differences in growth state. SRDPNet was validated on the seedling rapeseed plant counting (SRPC) dataset created in this study. The experimental results show that SRDPNet achieves highly accurate counting on the SRPC dataset, with a high coefficient of determination (R² = 0.97396), a low mean absolute error (MAE = 3.26), and a low mean square error (MSE = 4.56), all superior to those of the comparison method. SRDPNet effectively addresses the visual challenges of rape seedlings in complex farmland scenes and improves robustness to complex visual variations.
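The abstract does not include implementation details, but the described design (a multi-column encoder whose columns use different filter sizes, each followed by channel and position attention, fused into a density map whose sum gives the plant count) can be sketched roughly as follows. This is a minimal illustration only: every layer width, kernel size, and module form below is an assumption in the general style of multi-column counting networks and dual-attention modules, not the authors' SRDPNet.

```python
# Minimal sketch of a multi-column density-prediction network with
# channel + position attention. All hyperparameters are assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)  # reweight channels


class PositionAttention(nn.Module):
    """Self-attention over spatial positions (assumed form)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # B x HW x C'
        k = self.key(x).flatten(2)                     # B x C' x HW
        attn = torch.softmax(q @ k, dim=-1)            # B x HW x HW
        v = self.value(x).flatten(2)                   # B x C x HW
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


def column(kernel_size):
    """One column: convolutions with a fixed kernel size to capture one scale."""
    pad = kernel_size // 2
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size, padding=pad), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size, padding=pad), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
        nn.Conv2d(32, 16, kernel_size, padding=pad), nn.ReLU(inplace=True),
    )


class MultiColumnDensityNet(nn.Module):
    """Three columns with different receptive fields, each refined by
    channel + position attention; fused features regress a density map."""
    def __init__(self):
        super().__init__()
        self.columns = nn.ModuleList([column(k) for k in (3, 5, 7)])
        self.attn = nn.ModuleList(
            nn.Sequential(ChannelAttention(16), PositionAttention(16)) for _ in range(3)
        )
        self.head = nn.Conv2d(48, 1, 1)  # fuse 3 x 16 channels into one density map

    def forward(self, x):
        feats = [a(c(x)) for c, a in zip(self.columns, self.attn)]
        return torch.relu(self.head(torch.cat(feats, dim=1)))


if __name__ == "__main__":
    net = MultiColumnDensityNet()
    img = torch.randn(1, 3, 128, 128)        # dummy RGB field image
    dmap = net(img)
    print(dmap.shape, float(dmap.sum()))     # estimated count = sum of density map
```

In this kind of density-regression setup, the predicted seedling count is the sum of the density map, and metrics such as MAE, MSE, and R² reported in the abstract would then be computed between predicted and ground-truth counts over the test images.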
