Abstract

This paper addresses the vision-based autonomous landing problem for an unmanned helicopter aerial vehicle (UAV) and proposes a multisensor fusion approach to autonomous landing. The ground-based system comprises an infrared camera, an ultra-wideband (UWB) radar that measures the distance between the UAV and the ground station, and a pan-tilt unit (PTU). The infrared camera enables the UAV target to be detected in all weather conditions. To avoid the computational complexity of recovering the target's three-dimensional coordinates with stereo vision or a single camera, the UWB radar module supplies the depth information directly: the image-driven PTU tracks the UAV in real time, and its angles, combined with the measured range, yield the UAV's three-dimensional coordinates. Comparison with DGPS measurements shows that the proposed method is effective and robust.
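The geometric idea behind the fusion can be sketched as a simple coordinate conversion: once the PTU is locked on the UAV, its pan and tilt angles give a bearing, and the UWB range gives the distance along that bearing. The sketch below is a hypothetical illustration under an assumed axis convention (x along pan = 0, z up, tilt measured as elevation); the paper's actual frames, calibration, and function names are not specified in the abstract.

```python
import math

def ptu_range_to_xyz(pan_rad: float, tilt_rad: float, range_m: float):
    """Convert PTU pan/tilt angles and a UWB range measurement into
    Cartesian coordinates in the ground station's frame.

    Axis convention (an assumption, not from the paper):
    x points along pan = 0, z points up, tilt is the elevation
    angle above the horizontal plane.
    """
    horizontal = range_m * math.cos(tilt_rad)  # projection onto the ground plane
    x = horizontal * math.cos(pan_rad)
    y = horizontal * math.sin(pan_rad)
    z = range_m * math.sin(tilt_rad)
    return x, y, z

# Example: a UAV 100 m away, 30 degrees above the horizon, straight ahead.
x, y, z = ptu_range_to_xyz(0.0, math.radians(30.0), 100.0)
```

In practice the pan/tilt readings would first be corrected so the optical axis passes through the target (using the target's pixel offset from the image center), but the spherical-to-Cartesian step above is the core of replacing stereo depth with a radar range.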
