Orchard robots play a crucial role in agricultural production, and autonomous navigation is the foundation of orchard robots and eco-unmanned farms. Accurate sensing and localization are prerequisites for achieving autonomous navigation. However, current vision-based navigation solutions are sensitive to environmental factors such as light, weather, and background, which degrade positioning accuracy and make them poorly suited to outdoor navigation. LiDAR, by contrast, provides accurate distance measurements across a wide range of environments: its measurements are unaffected by light, colour, weather, and similar factors, making it suitable for detecting low objects in complex orchard scenes. LiDAR-based navigation is therefore better suited to orchard environments. In complex orchards, tree branches and foliage degrade Global Navigation Satellite System (GNSS) accuracy and can cause signal loss, so the central challenges are generating navigation paths and localizing the orchard robot. In this paper, an improved Simultaneous Localization and Mapping (SLAM) method combined with an improved A-star algorithm is proposed. The SLAM and path-planning method designed in this study effectively addresses the insufficient smoothness and large curvature fluctuations of paths planned in complex orchard environments, and improves the robot's detection efficiency. The experimental results indicate that the method can consistently and accurately fulfil the robot's detection needs in intricate orchard environments.
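For readers unfamiliar with the A-star algorithm referenced above, the following is a minimal illustrative sketch of standard A* search on a 2D occupancy grid (0 = free, 1 = obstacle). It is not the paper's improved variant: the grid layout, 4-connectivity, and Manhattan heuristic are assumptions chosen for simplicity, and the paper's smoothness and curvature improvements are not reproduced here.

```python
import heapq

def astar(grid, start, goal):
    """Standard A* on a 4-connected occupancy grid.

    grid: list of lists, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns the path as a list of cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible on a 4-connected grid with unit step cost.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    g_cost = {start: 0}

    while open_heap:
        _, g, cur = heapq.heappop(open_heap)
        if cur == goal:
            # Reconstruct the path by walking parent pointers back to start.
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if g > g_cost.get(cur, float("inf")):
            continue  # stale heap entry; a cheaper route was already found
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None
```

Because plain A* on a grid produces piecewise-axis-aligned paths with sharp turns, post-processing (smoothing, curvature constraints) of the kind the paper targets is typically required before such a path is drivable by a field robot.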