Abstract

Machine learning techniques have accelerated the development of autonomous navigation algorithms in recent years, especially algorithms for on-road autonomous navigation. However, off-road navigation in unstructured environments continues to challenge autonomous ground vehicles. Many off-road navigation systems rely on LIDAR to sense and classify the environment, but LIDAR sensors often fail to distinguish navigable vegetation from non-navigable solid obstacles. While other areas of autonomy have benefited from the use of simulation, there has not been a real-time LIDAR simulator that accounted for LIDAR–vegetation interaction. In this work, we outline the development of a real-time, physics-based LIDAR simulator for densely vegetated environments that can be used in the development of LIDAR processing algorithms for off-road autonomous navigation. We present a multi-step qualitative validation of the simulator, which includes the development of an improved statistical model for the range distribution of LIDAR returns in grass. As a demonstration of the simulator’s capability, we show an example of the simulator being used to evaluate autonomous navigation through vegetation. The results demonstrate the potential for using the simulation in the development and testing of algorithms for autonomous off-road navigation.

Highlights

  • Light detection and ranging sensors, commonly referred to as LIDAR, are ubiquitous in off-road autonomous navigation because they provide a direct measurement of the geometry of the operating environment of the robot [1]

  • Until now, simulations have either lacked the fidelity to realistically capture LIDAR–vegetation interaction or been too computationally slow and difficult to integrate with existing autonomy algorithms [8]

  • While there are several different types of LIDAR sensors, this paper focuses on incoherent micro-pulse LIDAR sensors, commonly referred to as Time-of-Flight (TOF) sensors, as this type of LIDAR is by far the most common type used in outdoor autonomous navigation
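The time-of-flight principle described above converts a measured pulse round-trip time into a range. As a minimal illustrative sketch (not taken from the paper, the function name and numeric example are ours):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_time_s: float) -> float:
    """Range measured by a time-of-flight LIDAR pulse.

    The emitted pulse travels to the target and back, so the
    one-way range is half the round-trip distance.
    """
    return C * round_trip_time_s / 2.0

# A return arriving ~667 ns after emission corresponds to roughly 100 m.
print(tof_range(667e-9))
```

In practice, sensors also report per-return intensity and may register multiple returns per pulse (relevant when a beam partially penetrates vegetation), but the range computation itself follows this relation.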


Introduction

Light detection and ranging sensors, commonly referred to as LIDAR, are ubiquitous in off-road autonomous navigation because they provide a direct measurement of the geometry of the operating environment of the robot [1]. One of the ongoing issues with LIDAR perception is the inability of the sensor to distinguish between navigable obstacles like grass and non-navigable solid obstacles. This problem is stated clearly by [2]: "Among the more pervasive and demanding requirements for operations in vegetation is the discrimination of terrain from vegetation, of rocks from bushes." Recent advances in simulation for robotics have demonstrated that autonomy algorithms can be developed and tested in simulation [5,6,7]. Until now, however, simulations have either lacked the fidelity to realistically capture LIDAR–vegetation interaction or been too computationally slow and difficult to integrate with existing autonomy algorithms [8].

