Abstract

A nonlinear guidance law, termed visual pursuit, is proposed for autonomously tracking a cooperative airborne target with a fixed, forward-looking camera. The method is intended for use with a vision sensor but is also compatible with inputs from other sensors such as GPS. Line-of-sight angles are fed into the guidance law, which produces roll and flight-path-angle-rate commands for tracking an aerial target and following it in close formation. Visual pursuit is built on a Lyapunov-based scalar system that is shown to be mathematically stable for a cooperative target. The method is shown to keep the target in the camera field of view in strong crosswinds by allowing the target position to float laterally within the field of view; the amount of movement is governed by an optimization algorithm that keeps the line-of-sight errors within a specified region of the camera field of view. For comparison, a linear pursuit method is also implemented; visual pursuit achieves effective tracking with less bank-angle effort in strong winds. Simulation and flight-test results demonstrate the suitability of the method.
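The abstract does not give the paper's Lyapunov-based control law, but the input-output relationship it describes (line-of-sight angles in, roll and flight-path-angle-rate commands out) can be sketched with a simple proportional pursuit law. This is an illustrative stand-in only, not the authors' method; the function name, gains, and saturation limit below are all assumptions.

```python
import math

def pursuit_commands(az_err, el_err, k_phi=1.5, k_gamma=0.8,
                     phi_max=math.radians(45.0)):
    """Illustrative proportional pursuit law (not the paper's method).

    az_err, el_err : line-of-sight azimuth/elevation errors of the
                     target in the camera frame, in radians.
    Returns a roll command (rad) and a flight-path-angle-rate
    command (rad/s) that steer the boresight toward the target.
    """
    # Bank toward the azimuth error, saturated at the roll limit.
    phi_cmd = max(-phi_max, min(phi_max, k_phi * az_err))
    # Pitch the flight path toward the elevation error.
    gamma_dot_cmd = k_gamma * el_err
    return phi_cmd, gamma_dot_cmd
```

A deadband around zero error, or the paper's optimization that lets the target float laterally in the field of view, would replace the raw proportional term in a wind-tolerant implementation.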
