Abstract

Deep learning-based object detectors enhance the capabilities of remote sensing platforms, such as Unmanned Aerial Vehicles (UAVs), across a wide spectrum of machine vision applications. However, deep learning introduces heavy computational requirements that prevent the deployment of such algorithms in scenarios with low-latency inference constraints, where mission-critical decisions must be made in real time. In this paper, we address the challenge of efficiently deploying region-based object detectors on aerial imagery by introducing an informed methodology for extracting candidate detection regions (proposals). Our approach combines information from the UAV's on-board sensors, such as flying altitude, with light-weight computer vision filters and prior domain knowledge to intelligently reduce the number of region proposals, eliminating false positives at an early stage of the computation. This significantly reduces the computational workload while sustaining detection accuracy. We apply and evaluate the proposed approach on the task of vehicle detection. Our experiments demonstrate that state-of-the-art detection models achieve up to 2.6x faster inference with our altitude-aware, data-driven methodology. In addition, we introduce and release to the community a novel vehicle-annotated, altitude-stamped dataset of real UAV imagery, captured at numerous flying heights under a wide span of traffic scenarios.
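
The abstract does not include code, but the altitude-aware pruning it describes can be illustrated with a minimal sketch. Assuming a nadir-pointing camera with known focal length and sensor width (all function names, parameter names, and default camera values below are hypothetical placeholders, not taken from the paper), the flying altitude fixes the ground sampling distance, which in turn bounds the plausible pixel footprint of a vehicle; proposals far outside those bounds can be discarded before the expensive detection stage:

```python
import numpy as np

def gsd_m_per_px(altitude_m, focal_mm, sensor_w_mm, image_w_px):
    """Ground sampling distance (metres per pixel) for a nadir-pointing camera."""
    return (altitude_m * sensor_w_mm) / (focal_mm * image_w_px)

def filter_proposals_by_altitude(boxes, altitude_m,
                                 focal_mm=8.8, sensor_w_mm=13.2, image_w_px=4000,
                                 vehicle_len_m=(3.0, 12.0),   # plausible vehicle length range
                                 vehicle_wid_m=(1.5, 3.0),    # plausible vehicle width range
                                 slack=1.5):                  # tolerance multiplier
    """Discard proposals whose metric extent is implausible for a vehicle
    at the current flying altitude. `boxes` is an (N, 4) float array of
    [x1, y1, x2, y2] proposals in pixels; returns a boolean keep mask."""
    gsd = gsd_m_per_px(altitude_m, focal_mm, sensor_w_mm, image_w_px)
    w_m = (boxes[:, 2] - boxes[:, 0]) * gsd  # box width in metres on the ground
    h_m = (boxes[:, 3] - boxes[:, 1]) * gsd  # box height in metres on the ground
    long_side = np.maximum(w_m, h_m)
    short_side = np.minimum(w_m, h_m)
    return (
        (long_side >= vehicle_len_m[0] / slack) &
        (long_side <= vehicle_len_m[1] * slack) &
        (short_side >= vehicle_wid_m[0] / slack) &
        (short_side <= vehicle_wid_m[1] * slack)
    )

# Usage: at 60 m altitude the first box (~4.5 m x 1.8 m on the ground) is a
# plausible vehicle and is kept; the second (~45 m square) is rejected early.
boxes = np.array([[100.0, 100.0, 300.0, 180.0],
                  [0.0, 0.0, 2000.0, 2000.0]])
mask = filter_proposals_by_altitude(boxes, altitude_m=60.0)
kept = boxes[mask]
```

This is only one plausible realization of the altitude-aware idea; the paper's actual pipeline also incorporates light-weight computer vision filters and domain priors beyond the size check sketched here.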
