Abstract
Unmanned air systems equipped with video cameras for surveillance and visual tracking of ground targets have worked relatively well when employing gimbaled cameras controlled by two or more operators: one to fly the vehicle and one to orient the camera and visually track ground targets. However, autonomous operation that reduces operator workload and crew levels is more challenging when the camera is strapdown, that is, fixed to the airframe without pan-and-tilt capability, rather than gimbaled, because the vehicle itself must be steered to orient the camera field of view. Visual tracking becomes even more difficult when the target follows an unpredictable path. This paper investigates a machine learning algorithm for visual tracking of stationary and moving ground targets by unmanned air systems with fixed, nongimbaled cameras. The learning agent determines an offline control policy for vehicle orientation and flight path such that a target can be kept in the image frame of the camera without the need for operator input. Performance of the control policy is demonstrated with simulation test case scenarios for stationary, linearly moving, and randomly moving targets with changes in target speed and trajectory. Monte Carlo results presented in the paper demonstrate that the learned policies can track stationary and moving targets subject to path perturbations, provided the perturbations are small. Because the learned policies are robust to small changes in target trajectory, learning a separate policy for every type of trajectory is not required. The approach is judged to have merit for autonomous visual tracking of both stationary and randomly moving ground targets.
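The abstract describes the offline policy-learning and Monte Carlo evaluation only at a high level, and it does not state which learning algorithm is used. The sketch below is therefore an illustrative assumption, not the authors' method: it uses a tabular Q-learning formulation in which the state is the discretized bearing of the target relative to the camera boresight, the actions are discrete heading-rate commands, and the reward favors keeping the target inside an assumed field of view; a small Monte Carlo loop then scores the learned policy on randomly perturbed target trajectories. All dynamics, parameter values, and function names are hypothetical.

```python
# Hypothetical sketch (not the paper's code): offline Q-learning of a steering
# policy that keeps a ground target inside the field of view of a body-fixed
# (strapdown) camera, followed by a small Monte Carlo check on perturbed
# target paths. State/action discretizations, dynamics, and the reward are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

DT = 0.1                                      # [s] simulation step
V_UAV = 20.0                                  # [m/s] constant airspeed
FOV_HALF = np.deg2rad(30.0)                   # assumed camera half field of view
TURN_RATES = np.deg2rad([-15.0, 0.0, 15.0])   # discrete heading-rate actions [rad/s]
N_BEARING_BINS = 21                           # discretized relative bearing states

def bearing_bin(bearing):
    """Map a relative bearing in [-pi, pi] to a discrete state index."""
    b = np.clip(bearing, -np.pi, np.pi)
    return int((b + np.pi) / (2 * np.pi) * (N_BEARING_BINS - 1))

def step(uav, psi, target, turn_rate, target_vel):
    """Advance UAV heading/position and target position one time step."""
    psi = psi + turn_rate * DT
    uav = uav + V_UAV * DT * np.array([np.cos(psi), np.sin(psi)])
    target = target + target_vel * DT
    rel = target - uav
    bearing = np.arctan2(rel[1], rel[0]) - psi          # angle off the boresight
    bearing = (bearing + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi]
    in_fov = abs(bearing) < FOV_HALF
    return uav, psi, target, bearing, in_fov

def train(episodes=2000, horizon=400, alpha=0.1, gamma=0.95, eps=0.1):
    """Learn an offline policy: reward +1 while the target stays in view."""
    Q = np.zeros((N_BEARING_BINS, len(TURN_RATES)))
    for _ in range(episodes):
        uav, psi = np.zeros(2), 0.0
        target = rng.uniform(100.0, 300.0, size=2)
        target_vel = rng.uniform(-5.0, 5.0, size=2)      # randomly moving target
        rel = target - uav
        s = bearing_bin(np.arctan2(rel[1], rel[0]) - psi)
        for _ in range(horizon):
            a = rng.integers(len(TURN_RATES)) if rng.random() < eps else int(Q[s].argmax())
            uav, psi, target, bearing, in_fov = step(uav, psi, target,
                                                     TURN_RATES[a], target_vel)
            s2 = bearing_bin(bearing)
            r = 1.0 if in_fov else -1.0
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2
    return Q

def monte_carlo(Q, runs=200, horizon=600, speed_perturb=1.0):
    """Fraction of time the target is kept in view over perturbed trajectories."""
    in_view, total = 0, 0
    for _ in range(runs):
        uav, psi = np.zeros(2), 0.0
        target = rng.uniform(100.0, 300.0, size=2)
        target_vel = rng.uniform(-5.0, 5.0, size=2)
        rel = target - uav
        s = bearing_bin(np.arctan2(rel[1], rel[0]) - psi)
        for _ in range(horizon):
            # Small random perturbation of the target velocity each step.
            target_vel = target_vel + rng.normal(0.0, speed_perturb, size=2) * DT
            a = int(Q[s].argmax())
            uav, psi, target, bearing, in_fov = step(uav, psi, target,
                                                     TURN_RATES[a], target_vel)
            s = bearing_bin(bearing)
            in_view += in_fov
            total += 1
    return in_view / total

if __name__ == "__main__":
    Q = train()
    print("fraction of time in field of view:", monte_carlo(Q))
```

In this toy setup the only control authority is the heading rate, which mirrors the strapdown-camera constraint described in the abstract: the aircraft must be steered to point the camera, rather than slewing a gimbal.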