Timely determination of weed distributions in fields is crucial for the precise spraying of herbicides, enabling effective weed control while reducing costs and protecting the environment. Existing weed detection strategies often rely on numerous weed samples to train detection models directly, which is challenging when weed samples are limited. To address this issue, a novel weed detection strategy was proposed in this study to identify weeds accurately in fields with varying coverage levels. For this purpose, red–green–blue (RGB) images of maize fields with different weed coverage levels were captured via a vertical take-off and landing fixed-wing unmanned aerial vehicle (UAV). The UAV images were first mosaicked, and a new weed detection strategy was developed and assessed. In this process, the MeanShift segmentation method, coupled with the local variance (LV) segmentation evaluation function and the Otsu automatic classification method, was initially employed to extract vegetation areas. The you-only-look-once (YOLO) v5n model was subsequently improved and used to detect maize plants. Finally, weed mapping was achieved by removing the identified maize plants from the vegetation through overlay analysis. The evaluation of the proposed method via an external dataset yielded favorable weed detection results, with an R2 value of 0.96 and a root mean square error (RMSE) value of 3.08% across the different weed coverage levels. Specifically, in addition to adjusting the activation function and the non-maximum suppression method, the impacts of integrating various attention modules at different positions on the performance of the YOLO v5n model for maize plant detection were analyzed. Improving the YOLO v5n model by incorporating the efficient channel attention (ECA) module into the backbone of the original model and utilizing the Hardswish activation function is recommended.
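The vegetation-extraction step described above couples MeanShift segmentation with Otsu's automatic classification. As an illustration of the Otsu step alone (the MeanShift and LV components are omitted, and the excess-green index and synthetic data below are assumptions for the sketch, not details taken from the paper), a minimal NumPy version might look like:

```python
import numpy as np

def excess_green(rgb):
    """Greenness index ExG = 2G - R - B (a common choice; the paper's
    exact vegetation index is not stated in the abstract)."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    return 2 * g - r - b

def otsu_threshold(values, bins=256):
    """Classic Otsu: choose the cut that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    w0 = np.cumsum(p)                     # probability of class 0 (<= threshold)
    mu = np.cumsum(p * np.arange(bins))   # cumulative mean (in bin units)
    w1, mu_t = 1.0 - w0, mu[-1]
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros(bins)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return edges[np.argmax(sigma_b) + 1]  # threshold in original value units

# Synthetic field patch: top half "soil", bottom half "vegetation" (greener).
rng = np.random.default_rng(0)
soil = rng.integers(90, 110, size=(50, 50, 3))
veg = soil.copy()
veg[..., 1] += 80
img = np.concatenate([soil, veg], axis=0).astype(np.uint8)

exg = excess_green(img)
mask = exg > otsu_threshold(exg.ravel())  # True where vegetation
```

In the study's pipeline, a mask like this would delimit the vegetation areas from which detected maize plants are later removed by overlay analysis.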
Overall, this study offers support for precise weed control.
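The reported R2 and RMSE figures compare predicted weed coverage against reference coverage on the external dataset. A short sketch of how these two metrics are computed from paired values (the sample numbers below are illustrative, not data from the study):

```python
import numpy as np

def r2_rmse(observed, predicted):
    """Coefficient of determination (R2) and root mean square error."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    resid = observed - predicted
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot, float(np.sqrt(np.mean(resid ** 2)))

# Illustrative weed-coverage values (%), not the study's data.
observed = [5.0, 10.0, 20.0, 35.0, 50.0]
predicted = [6.0, 9.0, 22.0, 33.0, 52.0]
r2, rmse = r2_rmse(observed, predicted)
```

An RMSE expressed in percentage points of coverage, as in the abstract, follows directly when both inputs are coverage percentages.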