Abstract

This paper presents an improved Faster R-CNN model for a field robot platform (FRP) aimed at automatically extracting image features and quickly and accurately detecting maize seedlings at different growth stages in complex field operation environments, in preparation for intelligent inter-row tillage in maize fields. An FRP equipped with five industrial USB cameras was used to capture a large number of sample images; the shooting angle of the cameras ranged from 0° to 90°. The photographs were used to build an image database of twenty thousand images of soil, maize, and weeds. Ten selected pretrained networks were used to replace the CNN feature extraction component of the classic Faster R-CNN, and based on this comparison a Faster R-CNN with a pretrained VGG19 feature extractor is proposed. The Faster R-CNN used in this work is a deep learning architecture that distinguishes maize seedlings from weeds under three field conditions: Full-cycle, Multi-weather, and Multi-angle. This work achieved greater than 97.71% precision in the detection of maize seedlings with respect to soil and weeds. The precision rate for six-leaf to seven-leaf maize seedlings was 2.74% lower than that of the total test set, the precision rate under sunny conditions was 1.97% lower than that of the total test set, and the precision rate at a shooting angle of 0° was 0.95% lower than that of the total test set. The proposed model has significant potential for autonomous weed and maize classification under actual operating conditions.
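For illustration, the sketch below shows one way a Faster R-CNN with a pretrained VGG19 feature extractor could be assembled. The use of PyTorch/torchvision, the anchor settings, and the class mapping (soil as background, maize, weed) are assumptions for the example, not details reported in the paper.

```python
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

# Convolutional features of an ImageNet-pretrained VGG19 serve as the backbone,
# standing in for the classic Faster R-CNN feature extractor.
vgg19 = torchvision.models.vgg19(weights="IMAGENET1K_V1")
backbone = vgg19.features
backbone.out_channels = 512  # VGG19 conv feature maps have 512 channels

# Region Proposal Network anchors; these sizes/ratios are illustrative defaults,
# not values reported in the paper.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)

# ROI pooling over the single backbone feature map.
roi_pooler = torchvision.ops.MultiScaleRoIAlign(
    featmap_names=["0"], output_size=7, sampling_ratio=2
)

# Assumed class mapping: 0 = background/soil, 1 = maize seedling, 2 = weed.
model = FasterRCNN(
    backbone,
    num_classes=3,
    rpn_anchor_generator=anchor_generator,
    box_roi_pool=roi_pooler,
)

# Quick sanity check on a dummy RGB image.
model.eval()
with torch.no_grad():
    predictions = model([torch.rand(3, 600, 800)])
print(predictions[0]["boxes"].shape)
```

In practice, the model would be fine-tuned on the field image database described above before evaluation on the Full-cycle, Multi-weather, and Multi-angle test conditions.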
