Abstract

Path planning systems using graph‐search algorithms such as A* usually operate in uniform plan‐view occupancy grids. However, the sensors used to construct these grids observe the environment in their own sample space based on sensor type and viewpoint. In this paper we present an image space technique for path planning in unknown unstructured outdoor environments. Our method differs from previous techniques in that we perform path search directly in image space—the native sensor space of the imaging sensor. After an image space path has been found, it is used for navigation in the real world. By operating at the resolution of the image sensor, image space planning facilitates accurate robot vs. obstacle localization and enables a high degree of movement precision. Our image space planning techniques can potentially be used with many different kinds of sensor data, and we experimentally evaluate the use of stereo disparity and color information. We present an extension to the basic image space planning system called the cylindrical planner that simulates a 2π field of view with a cylindrically shaped occupancy grid. We believe that image space planning is well suited for use in the local subsystem of a hierarchical planner and implement a hybrid hierarchical planner that utilizes the cylindrical planner as a local planning subsystem and a two‐dimensional Cartesian planner as the global planning subsystem. All three systems are implemented and experimentally tested on a real robot. We evaluate the failure modes of image space planning and discuss how to avoid them. We find that image space enables precise real‐time, near‐field planning. © 2009 Wiley Periodicals, Inc.
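To make the core idea of searching directly in the sensor's sample space concrete, the following is a minimal sketch of A* over a per-pixel cost grid, assuming a precomputed 2D cost array (for example, derived from a stereo disparity image, with high cost where pixels likely belong to obstacles). The function name `astar_image_space`, the cost model, and the 8-connected neighborhood are illustrative assumptions, not the authors' implementation.

```python
import heapq
import numpy as np

def astar_image_space(cost, start, goal):
    """A* search over a per-pixel cost grid (rows x cols).

    `cost` is a 2D array of non-negative traversal costs (assumed here
    to come from, e.g., stereo disparity). `start` and `goal` are
    (row, col) pixel coordinates. Returns a start-to-goal list of
    pixels, or None if no path exists.
    """
    rows, cols = cost.shape

    def h(p):  # admissible heuristic: straight-line pixel distance
        return ((p[0] - goal[0]) ** 2 + (p[1] - goal[1]) ** 2) ** 0.5

    open_heap = [(h(start), 0.0, start)]
    g = {start: 0.0}
    parent = {}
    closed = set()

    while open_heap:
        _, g_cur, cur = heapq.heappop(open_heap)
        if cur in closed:
            continue
        if cur == goal:
            # Reconstruct the path by walking parent links back to start.
            path = [cur]
            while cur in parent:
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        closed.add(cur)
        r, c = cur
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1),
                       (-1, -1), (-1, 1), (1, -1), (1, 1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            step = (dr * dr + dc * dc) ** 0.5
            g_new = g_cur + step * (1.0 + cost[nr, nc])
            if g_new < g.get((nr, nc), float("inf")):
                g[(nr, nc)] = g_new
                parent[(nr, nc)] = cur
                heapq.heappush(open_heap,
                               (g_new + h((nr, nc)), g_new, (nr, nc)))
    return None

# Example: a synthetic 480x640 "image" with a high-cost vertical band
# (a stand-in obstacle) and a gap near the bottom of the image.
if __name__ == "__main__":
    cost = np.zeros((480, 640))
    cost[:, 300:340] = 100.0
    cost[400:, 300:340] = 0.0
    path = astar_image_space(cost, start=(470, 50), goal=(470, 600))
    print("path length (pixels):", len(path) if path else None)
```

In the paper's setting, the resulting pixel path would still have to be projected back into the world for execution; this sketch only illustrates the search step at sensor resolution.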
