Abstract

We present a monocular RGB vision system to estimate the pose (3D position and orientation) of a fixed-wing Unmanned Aerial Vehicle (UAV) with respect to the camera reference frame. Using this estimate, a Ground Control Station (GCS) can control the UAV trajectory during landing on a Fast Patrol Boat (FPB). A ground-based vision system makes it possible to use more sophisticated algorithms, since more processing power is available on the ground. The proposed method uses a 3D model-based approach built on a Particle Filter (PF) and divided into five stages: (i) frame capture, (ii) target detection, (iii) distortion correction, (iv) appearance-based pose sampling, and (v) pose estimation. In the frame capture stage, we obtain a new observation (a new frame). In the target detection stage, we detect the UAV region in the captured frame using a detector based on a Deep Neural Network (DNN). In the distortion correction stage, we correct the frame's radial and tangential distortions to obtain a better estimate. In the appearance-based pose sampling stage, we use a pre-trained, synthetically generated database for a rough pose initialization. In the pose estimation stage, we apply an optimization algorithm to obtain a low-error estimate of the UAV pose in the captured frame. Overall system performance is improved by using the Graphics Processing Unit (GPU) for parallel processing. Results show that GPU computational resources are essential for achieving real-time pose estimation.
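
The sketch below is a rough illustration of how such a five-stage loop could be organized; it is not the authors' implementation. The detector, pose-sampler lookup, and particle-filter refinement are stand-in stubs, the calibration values are placeholders, and OpenCV's `cv2.undistort` is assumed for the distortion-correction step.

```python
import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients; in practice these come
# from camera calibration.
CAMERA_MATRIX = np.array([[1000.0,    0.0, 640.0],
                          [   0.0, 1000.0, 360.0],
                          [   0.0,    0.0,   1.0]])
DIST_COEFFS = np.array([-0.2, 0.05, 0.001, 0.001, 0.0])  # k1, k2, p1, p2, k3


def detect_uav(frame):
    """Stand-in for the DNN detector: return a bounding box (x, y, w, h) or None."""
    raise NotImplementedError


def sample_initial_pose(roi):
    """Stand-in for the appearance-based pose sampler: look up a rough pose
    (rotation, translation) in a pre-trained database of synthetic views."""
    raise NotImplementedError


def refine_pose(roi, rough_pose):
    """Stand-in for the PF/optimization stage that refines the rough pose
    against the 3D model of the UAV."""
    raise NotImplementedError


def estimate_pose_stream(video_source=0):
    cap = cv2.VideoCapture(video_source)
    while True:
        # (i) frame capture
        ok, frame = cap.read()
        if not ok:
            break
        # (ii) target detection
        box = detect_uav(frame)
        if box is None:
            continue
        # (iii) distortion correction
        undistorted = cv2.undistort(frame, CAMERA_MATRIX, DIST_COEFFS)
        x, y, w, h = box
        roi = undistorted[y:y + h, x:x + w]
        # (iv) appearance-based pose sampling (rough initialization)
        rough_pose = sample_initial_pose(roi)
        # (v) pose estimation (refinement)
        yield refine_pose(roi, rough_pose)
    cap.release()
```

In a GPU-accelerated system, the DNN inference and the per-particle pose evaluation would typically be the stages offloaded to the GPU, since they dominate the per-frame cost.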
