Abstract

This paper presents our approach to intercepting a faster intruder UAV, inspired by the MBZIRC 2020 Challenge 1. By utilizing a priori knowledge of the shape of the intruder’s trajectory, we can calculate an interception point. Target tracking is based on image processing by a YOLOv3 Tiny convolutional neural network, combined with depth calculation using a gimbal-mounted ZED Mini stereo camera. We use RGB and depth data from the camera, devising a noise-reducing histogram filter to extract the target’s 3D position. The obtained 3D measurements of the target’s position are used to estimate the position, orientation, and size of its figure-eight-shaped trajectory, which we approximate with a Bernoulli lemniscate. Once the approximation is deemed sufficiently precise, as measured by the distance between the observations and the estimate, we calculate an interception point that places the interceptor UAV directly on the intruder’s path. Our method, significantly improved since the MBZIRC competition based on the experience gathered there, has been validated in simulation and through field experiments. Our results confirm that the developed visual-perception module can extract information describing the intruder UAV’s motion with precision sufficient to support interception planning. In the majority of our simulated encounters, we can track and intercept a target that moves 30% faster than the interceptor. Corresponding tests in an unstructured environment yielded 9 successful interceptions out of 12 attempts.
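For reference, the Bernoulli lemniscate used to approximate the figure-eight trajectory admits the standard parametric form below, where $a$ sets the size of the curve; the fitted position and orientation would then act as a rigid transform of this canonical shape:

$$
x(t) = \frac{a\cos t}{1+\sin^2 t}, \qquad
y(t) = \frac{a\,\sin t\,\cos t}{1+\sin^2 t}, \qquad t \in [0, 2\pi).
$$

The abstract only names the noise-reducing histogram filter; its exact implementation is given in the body of the paper. As an illustration only, a minimal mode-seeking sketch of such a filter over a detector bounding box might look like the following (all names, parameters, and thresholds are hypothetical, not the authors' code):

```python
import numpy as np

def target_depth_from_bbox(depth_map, bbox, bin_width=0.25, max_range=30.0):
    """Estimate the target's depth inside a detection bounding box.

    Valid depth pixels in the box are histogrammed, and the most populated
    bin is taken as the target's depth, suppressing background pixels and
    stereo noise. Hypothetical sketch, not the paper's implementation.

    depth_map : 2D array of per-pixel depths in metres (NaN where invalid)
    bbox      : (x_min, y_min, x_max, y_max) in pixel coordinates
    """
    x0, y0, x1, y1 = bbox
    patch = depth_map[y0:y1, x0:x1].ravel()
    patch = patch[np.isfinite(patch) & (patch > 0.0) & (patch < max_range)]
    if patch.size == 0:
        return None  # no usable depth inside the box

    bins = np.arange(0.0, max_range + bin_width, bin_width)
    counts, edges = np.histogram(patch, bins=bins)
    k = np.argmax(counts)  # modal bin, assumed to be the target surface

    # Average only the pixels that fall in the winning bin.
    in_bin = (patch >= edges[k]) & (patch < edges[k + 1])
    return float(patch[in_bin].mean())
```

Combining such a modal depth with the detection's pixel coordinates, the camera intrinsics, and the gimbal and UAV poses would yield the 3D target measurements that feed the lemniscate fit.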
