Abstract

Much research has been accomplished in the area of drone landing, and specifically pose estimation. While some of these works focus on sensor fusion using GPS or GNSS, we propose a method that fuses four Time-of-Flight (ToF) range sensors with a monocular camera. When the landing platform is unstable, for example on ships in the ocean, the pose uncertainty grows and tracking fails easily. We designed an algorithm that uses four ToF sensors for calibration and one for pose estimation. The landing process was divided into two main phases, the rendezvous and the final landing, with one key assumption made for each. During the rendezvous, the landing platform's movement can be ignored, while during the landing phase, the drone is assumed to be stable and waiting for the best time to land. The current research models the landing phase as a stable drone and an unstable landing platform, a Stewart platform with a mounted AprilTag. A novel calibration algorithm was used based on color thresholding, a convex hull, and centroid extraction. Next, using the homogeneous-coordinate equations of the sensors' touching points, the focal lengths in the X and Y directions can be calculated. In addition, knowing the plane equation allows the Z coordinates of the landmark points to be recovered. The homogeneous-coordinate equation was then used to obtain the landmark's X and Y Cartesian coordinates. Finally, a 3D rigid-body transformation is applied to express the landing platform's pose in the camera frame. The test bench used Software-in-the-Loop (SIL) simulation to confirm the practicality of the method. The results of this work are promising for unstable-landing-platform pose estimation and offer a significant improvement over single-camera pose estimation with the AprilTag detection algorithm (ATDA).
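The geometric steps named in the abstract (back-projecting image landmarks onto a known plane via the homogeneous pinhole equations, then fitting a 3D rigid-body transform between landmark sets) can be sketched as below. This is a minimal illustration, not the authors' implementation: the function names and pinhole intrinsics are hypothetical, and the Kabsch SVD method is one common choice for the rigid-body fit, assumed here.

```python
import numpy as np

def backproject_to_plane(u, v, fx, fy, cx, cy, plane):
    """Intersect the camera ray through pixel (u, v) with a known plane.

    plane = (a, b, c, d) describes a*X + b*Y + c*Z + d = 0 in the
    camera frame. Returns the 3D point on the plane (camera coordinates).
    """
    # Homogeneous pinhole model: normalized ray direction through the pixel.
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    a, b, c, d = plane
    # Solve (a, b, c) . (t * ray) + d = 0 for the ray parameter t.
    t = -d / (a * ray[0] + b * ray[1] + c * ray[2])
    return t * ray  # camera center is at the origin

def rigid_transform(P, Q):
    """Least-squares R, t with Q ~= R @ P + t (Kabsch algorithm).

    P, Q: (3, N) arrays of corresponding 3D points.
    """
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    H = (P - p_mean) @ (Q - q_mean).T          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Back-projecting each detected landmark gives its camera-frame coordinates; feeding those and the corresponding platform-frame landmark coordinates to `rigid_transform` yields the platform pose in the camera frame.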

Highlights

  • An introduction to this research is followed by a review of the recent pose estimation techniques for unmanned aerial vehicles (UAVs).

  • The camera calibration process was carried out from below ground level to some distance above the platform, with the whole system placed on the back of a rover.

  • The estimation results of this work reveal a notable improvement in translational as well as angular pose compared to the AprilTag navigation algorithm, but at the expense of …

Introduction

An introduction to this research is followed by a review of the recent pose estimation techniques for unmanned aerial vehicles (UAVs). UAV-enabled services already range from food delivery to moon landing. Several tasks must work together to materialize drone control and flight, including takeoff, landing, and hovering. To make these tasks operational, a drone must have full knowledge of its position (or "pose") in space. Pose estimation is thus an inseparable part of drone control, as reflected in the plethora of research relating to these tasks over the past few years.
