Abstract

A tracking controller for unmanned aerial vehicles (UAVs) is developed to track moving targets in the presence of occlusion. The controller tracks a moving target from its bounding box, detected by a deep neural network using the you-only-look-once (YOLO) method. The features generated by the YOLO approach relax the usual assumption that feature points are continuously available, which facilitates both the estimation using an unscented Kalman filter (UKF) and the design of the image-based tracking controller in this work. The challenge is that under occlusion the bounding box of the moving target becomes unavailable, causing the estimate to diverge. To address this, a motion model derived by quadratic programming is employed as the process model in the UKF, and the estimated velocity is applied as a feedforward term in the developed tracking controller to enhance tracking performance. Since no motion constraint is assumed for the target, the developed controller can be applied to various moving targets. Simulations demonstrate the performance of the developed estimator and controller in the presence of occlusion, and experiments verify their efficacy.
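The occlusion-handling idea in the abstract — keep propagating the UKF's process model when the measurement disappears — can be illustrated with a minimal sketch. The paper's QP-derived motion model and its bounding-box measurement model are not given in the abstract, so this hypothetical NumPy example substitutes a constant-velocity process model and measures only the bounding-box centre; passing `z=None` during occlusion runs the prediction step alone. All function names and parameter values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def merwe_sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Scaled sigma points and mean/covariance weights for state x, covariance P."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)      # matrix square root of (n+lam)*P
    pts = np.empty((2 * n + 1, n))
    pts[0] = x
    for i in range(n):
        pts[i + 1] = x + L[:, i]
        pts[n + i + 1] = x - L[:, i]
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha**2 + beta)
    return pts, Wm, Wc

def unscented_transform(pts, Wm, Wc, noise):
    """Weighted mean and covariance of transformed sigma points, plus additive noise."""
    mean = Wm @ pts
    d = pts - mean
    return mean, d.T @ (Wc[:, None] * d) + noise

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF cycle. Pass z=None while the target is occluded: the process
    model is propagated alone, which is how the estimate survives occlusion."""
    pts, Wm, Wc = merwe_sigma_points(x, P)
    fpts = np.array([f(p) for p in pts])
    x_pred, P_pred = unscented_transform(fpts, Wm, Wc, Q)
    if z is None:                               # occlusion: prediction only
        return x_pred, P_pred
    # redraw sigma points around the prediction for the measurement update
    pts, Wm, Wc = merwe_sigma_points(x_pred, P_pred)
    hpts = np.array([h(p) for p in pts])
    z_pred, S = unscented_transform(hpts, Wm, Wc, R)
    Pxz = (pts - x_pred).T @ (Wc[:, None] * (hpts - z_pred))
    K = Pxz @ np.linalg.inv(S)
    return x_pred + K @ (z - z_pred), P_pred - K @ S @ K.T
```

With a planar state `[px, py, vx, vy]`, the estimated velocity `x[2:4]` is what would feed forward into the tracking controller; during an occluded frame one calls `ukf_step(x, P, None, f, h, Q, R)` so the state keeps evolving under the motion model.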
