Field path planning provides the basis for the autonomous navigation of agricultural vehicles. Existing path-planning approaches are constrained to a 2D plane, disregarding the impact of terrain on navigation tasks. Furthermore, these methods generate non-global paths: when planning is based on the actual crop growth in the field, they cannot accommodate the headland turning area. Low-altitude remote sensing technology, characterized by high spatial resolution, timely data acquisition, and strong terrain perception, holds great potential for autonomous navigation. This study integrates low-altitude remote sensing with path-planning tasks for agricultural vehicles by constructing oblique photography models of fields, with the objective of achieving global 3D path planning through methods rooted in deep learning and image processing. The proposed method comprises four main steps. First, low-altitude remote sensing models of field blocks were constructed. Second, the models were converted into image patches to establish a dataset. Third, primary crop rows were identified by semantic segmentation. Finally, global 3D paths covering both the farmland and headland areas were generated. Tea fields in hilly areas were used to test the algorithm. Experiments revealed that the proposed method adapts to different field shapes, row numbers, row spacings, and vehicle turning radii. The method uses deep learning and image processing as its primary technical tools but goes beyond traditional crop row detection to form a global path-planning strategy. In addition, the elevation information provided by the 3D detection scheme allows agricultural vehicles to gain a comprehensive understanding of both their own position and that of their destination. The dataset and the code are available at: https://github.com/ZeroHeading/Global-3D-path-generation.git.
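The final two steps (crop-row identification followed by global 3D path generation) can be sketched in miniature. The sketch below is an illustrative assumption, not the paper's actual implementation: it takes a binary crop-row mask (such as a semantic segmentation output) together with a digital elevation model (DEM), extracts one span per crop row, and stitches the spans into a boustrophedon coverage path whose waypoints carry elevation. The function names, data layout, and toy inputs are all hypothetical.

```python
# Hypothetical sketch of steps 3-4: mask + DEM -> ordered 3D waypoints.
# Assumes crop rows run left-to-right, one per image row of the mask.

def extract_row_paths(mask):
    """Return one (y, x_start, x_end) span per mask row containing
    crop pixels (value 1)."""
    spans = []
    for y, row in enumerate(mask):
        xs = [x for x, v in enumerate(row) if v]
        if xs:
            spans.append((y, min(xs), max(xs)))
    return spans

def boustrophedon_3d(spans, dem):
    """Traverse spans top-to-bottom, alternating direction so the
    vehicle turns at each headland, and lift every endpoint to 3D
    by sampling the DEM (z = dem[y][x])."""
    path = []
    for i, (y, x0, x1) in enumerate(spans):
        a, b = (x0, x1) if i % 2 == 0 else (x1, x0)
        path.append((a, y, dem[y][a]))
        path.append((b, y, dem[y][b]))
    return path

# Toy 4x4 example: two crop rows and a tilted terrain surface.
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 0],
]
dem = [[float(y + x) for x in range(4)] for y in range(4)]
waypoints = boustrophedon_3d(extract_row_paths(mask), dem)
```

A real pipeline would additionally insert headland turning arcs between consecutive rows, sized by the vehicle's turning radius, which is the part the abstract highlights as missing from 2D, non-global planners.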