Abstract
The extraction of navigation lines plays a crucial role in the autonomous navigation of agricultural robots. To address the poor real-time performance and susceptibility to light interference of navigation path recognition in field environments, this work proposes a deep-learning-based method for extracting inter-ridge navigation routes. The technique builds on the Res2Net50 backbone and incorporates the Squeeze-and-Excitation (SE) attention mechanism to focus on the key regions of the image. An atrous spatial pyramid pooling (ASPP) module is introduced to further extract high-level semantic information and enhance the network's capacity for fine-grained representation. Skip connections fuse the extracted high-level semantic features with the low-level texture features, after which the ridge prediction is obtained and the final image segmentation realized through upsampling. Finally, navigation feature points are retrieved from the resulting ridge segmentation mask and the navigation line is fitted to them. The experimental results show that the Mean Intersection over Union (MIoU) and F-measure of the proposed inter-ridge navigation path extraction method increase by 0.157 and 0.061, respectively, compared with the Res2Net50 network. Under various illumination conditions, the average pixel error is 8.27 pixels and the average angle error is 1.395°. The method is suitable for ridge operations and effectively improves the accuracy of the network prediction model.
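The final step described above, retrieving feature points from the segmentation mask and fitting a navigation line, can be sketched as follows. This is a minimal illustration only, not the paper's actual procedure: it assumes the feature point for each image row is the column centroid of the ridge pixels in that row, and that the navigation line is fitted by least squares; both choices are assumptions for illustration.

```python
import numpy as np

def extract_feature_points(mask: np.ndarray) -> np.ndarray:
    """For each row of a binary ridge mask, take the column centroid of the
    ridge pixels as a navigation feature point. (Assumed, simplified stand-in
    for the paper's feature-point extraction.) Returns an (N, 2) array of
    (row, col) points."""
    points = []
    for row_idx, row in enumerate(mask):
        cols = np.flatnonzero(row)
        if cols.size:
            points.append((row_idx, cols.mean()))
    return np.array(points)

def fit_navigation_line(points: np.ndarray):
    """Least-squares fit of col = a * row + b through the feature points."""
    rows, cols = points[:, 0], points[:, 1]
    a, b = np.polyfit(rows, cols, deg=1)
    return a, b

# Synthetic 100x100 mask with a 5-pixel-wide ridge along col = 0.5*row + 10.
mask = np.zeros((100, 100), dtype=np.uint8)
for r in range(100):
    c = int(0.5 * r + 10)
    mask[r, max(c - 2, 0):c + 3] = 1

a, b = fit_navigation_line(extract_feature_points(mask))
```

On this synthetic mask the fitted slope and intercept recover roughly 0.5 and 10, confirming the pipeline; on a real segmentation output the mask would first need per-ridge connected-component separation so that only one ridge contributes to each fit.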