Abstract

Autonomous vehicle control can benefit from the abstraction of data from multiple sensors into an information stream containing only relevant aspects of the environment. We use this abstracted information stream to create a virtual environment, which differs from the real environment in that only objects relevant to the vehicle are identified and mapped. We are developing an Autonomous Surface Vehicle (ASV) to compete in the 2009 ASV challenge sponsored by the Association for Unmanned Vehicle Systems International and the Office of Naval Research. The ASV is equipped with a sensor suite that includes two forward-looking color CCD cameras. By applying image processing and computer vision techniques, such as edge detection, blob detection, and stereo image disparity, the ASV generates abstracted electro-optical data that is combined with additional sensor data to continuously map the virtual environment with measurements of the elements of the real world relevant to the competition tasks. The virtual environment provides all information used in task and objective completion, including waypoint and buoy navigation, target identification and elimination, friendly identification and recovery, docking, and obstacle avoidance. The fusion of sensor data to form a virtual environment is a prime goal of the ASV's design, allowing for the simplification of task, objective, and behavioral programming. In this paper, we report on the design of the virtual environment fusion algorithm for waypoint and buoy navigation. We also discuss the techniques used to update the virtual environment.
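The stereo-disparity step described above can be illustrated with a minimal sketch: given a detected object's pixel disparity between the two cameras, the standard pinhole relation Z = fB/d recovers its range, which can then be entered into a sparse map of relevant objects. The focal length, baseline, principal point, and the `VirtualEnvironment` class below are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch: range from stereo disparity, then placement of the
# detected object in a simple 2-D "virtual environment" map that holds
# only objects relevant to the vehicle. All numeric camera parameters
# here are assumed values for illustration.

from dataclasses import dataclass, field

FOCAL_LENGTH_PX = 800.0   # assumed focal length, in pixels
BASELINE_M = 0.30         # assumed distance between the two cameras, in meters
CX_PX = 400.0             # assumed principal-point column, in pixels

def depth_from_disparity(disparity_px: float) -> float:
    """Pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

@dataclass
class VirtualEnvironment:
    """Sparse map containing only objects relevant to the vehicle."""
    objects: list = field(default_factory=list)

    def update(self, label: str, x_m: float, z_m: float) -> None:
        """Record (or re-record) a relevant object at lateral offset x, range z."""
        self.objects.append({"label": label, "x": x_m, "z": z_m})

# Example: a buoy blob detected at pixel column u with disparity d.
u_px, d_px = 500.0, 20.0
z = depth_from_disparity(d_px)              # range to the buoy (meters)
x = (u_px - CX_PX) * z / FOCAL_LENGTH_PX    # lateral offset (meters)

env = VirtualEnvironment()
env.update("red_buoy", x, z)
```

In this sketch, only the abstracted measurement (label, lateral offset, range) reaches the map; the raw images are discarded, mirroring the paper's goal of feeding task logic from the virtual environment alone.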
