Abstract

Discriminative object tracking systems for unmanned aerial vehicles (UAVs) are widely used in numerous applications. Although ample research has been carried out in this domain, implementing a low-computational-cost algorithm on a UAV onboard embedded system remains challenging. To address this issue, we propose a low-computational-complexity discriminative object tracking approach for UAVs based on the patch color group feature (PCGF) framework. The tracked object is divided into several non-overlapping local image patches, and the features of each patch are extracted into PCGFs, which consist of Gaussian mixture models (GMMs). The object location is estimated by comparing similar PCGFs between the previous frame and the current frame. Background PCGFs within the object window are removed by scanning features in four directions and comparing them against a dynamic threshold, which improves tracking accuracy. In terms of execution speed, the proposed algorithm achieves 32.5 frames per second (FPS) on an x64 CPU platform without a GPU accelerator and 17 FPS on a Raspberry Pi 4. Therefore, this work offers a practical solution for running a low-computational-complexity PCGF algorithm on a UAV onboard embedded system to improve flight times.
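
As an illustration of how such a patch-based pipeline might be organized, the following Python sketch splits the object window into non-overlapping patches, fits a small color GMM to each patch as its PCGF, and re-locates the object by scoring candidate windows against the GMMs stored from the previous frame. The function names, the 16-pixel patch size, and the use of scikit-learn's GaussianMixture are assumptions made for this sketch, not details taken from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

PATCH = 16  # patch side length in pixels (assumed, not from the paper)

def extract_pcgfs(frame, window, n_components=2):
    """Split the object window into non-overlapping patches and fit a
    small color GMM (the PCGF) to the pixels of each patch."""
    x, y, w, h = window
    pcgfs = []
    for dy in range(0, h - PATCH + 1, PATCH):
        for dx in range(0, w - PATCH + 1, PATCH):
            pixels = frame[y + dy:y + dy + PATCH,
                           x + dx:x + dx + PATCH].reshape(-1, 3)
            gmm = GaussianMixture(n_components=n_components).fit(pixels)
            pcgfs.append(((dx, dy), gmm))  # patch offset within the window + its GMM
    return pcgfs

def locate(frame, prev_pcgfs, candidates):
    """Score each candidate window by how well its patches fit the GMMs
    from the previous frame and return the best-scoring window."""
    best_score, best_win = -np.inf, None
    for (x, y, w, h) in candidates:
        score = 0.0
        for (dx, dy), gmm in prev_pcgfs:
            patch = frame[y + dy:y + dy + PATCH,
                          x + dx:x + dx + PATCH].reshape(-1, 3)
            score += gmm.score(patch)  # mean per-pixel log-likelihood
        if score > best_score:
            best_score, best_win = score, (x, y, w, h)
    return best_win
```

In practice, the candidate windows would be generated around the previous object location, and the PCGFs would be refreshed from the newly located window before processing the next frame.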

Highlights

  • Unmanned aerial vehicles (UAVs) have rapidly evolved, and UAV-based analysis applications appear in various fields, such as transportation engineering systems [1], UAV bridge inspection platforms [2], UAV-based traffic analysis [3], and oil pipeline patrol and factory inspection [4]

  • UAV object tracking algorithms are categorized into deep learning (DL) methods and generic methods

  • Discriminative correlation filter (DCF)-based tracking systems achieve a good balance of speed and accuracy

Introduction

Unmanned aerial vehicles (UAVs) have rapidly evolved, and UAV-based analysis applications appear in various fields, such as transportation engineering systems [1], UAV bridge inspection platforms [2], UAV-based traffic analysis [3], and oil pipeline patrol and factory inspection [4].

After the characteristic value P_{m,n} of every patch is defined, the next step is removing background patches from the object window. The proposed algorithm extracts patches and derives a threshold from the background window for every vertical and horizontal line of patches. The upper half of the object patches is scanned and labeled according to Equation (8), where (m, n) is the coordinate of the uppermost threshold (thr) patch and k is the displacement value. The thr value is compared with the characteristic values of the upper half of the object patches (H_{obj}). To reduce computation, once Label_{m,n+k−1} is determined to be an object patch, the remaining patches in that line require no further assessment and are directly identified as object patches.
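
A minimal sketch of this early-stopping scan is given below, assuming the per-patch characteristic values are held in a 2-D array and that exceeding thr marks a patch as object; the actual labeling rule is Equation (8) in the paper, which is not reproduced here, so the comparison used in the sketch is only a stand-in.

```python
import numpy as np

BACKGROUND, OBJECT = 0, 1

def scan_upper_half(char_vals, thr):
    """Label upper-half patches line by line: scan downward from the top,
    and once a patch is classified as object, label the rest of that line
    as object without further comparisons (early stop)."""
    rows, cols = char_vals.shape        # char_vals[k, n] ~ P_{m,n} per patch (assumed layout)
    labels = np.full((rows, cols), BACKGROUND, dtype=int)
    upper = rows // 2                   # only the upper half is scanned here
    for n in range(cols):               # each vertical line of patches
        for k in range(upper):          # displacement from the uppermost patch
            if char_vals[k, n] > thr:   # stand-in for the Equation (8) test
                labels[k:upper, n] = OBJECT  # remaining patches labeled directly
                break
    return labels
```

The same scan would be repeated from the other three directions, each with its own threshold taken from the surrounding background window, to strip background patches from all sides of the object window.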

