Abstract

UAVs operating in a leader-follower formation require knowledge of the relative pose between the collaborating members. This typically necessitates exchanging the information over an RF link, which increases communication latency and can easily result in lost data packets. In this work, rather than relying on this autopilot data exchange, a visual scheme using passive markers is presented. Each formation member carries passive markers in a RhOct configuration. These markers are visually detected and the relative pose of the members is determined on board, thus eliminating the need for RF communication. A reference path is then computed for each follower that tracks the leader and maintains a constant distance between the formation members. Experimental studies show a mean position detection error of (5 × 5 × 10 cm), or less than 0.0031% of the available workspace [0.5 up to 5 m, 50.43° × 38.75° field of view (FoV)]. The robustness of the suggested scheme against varying delays is also examined in these studies, where it is shown that a delay of up to 1.25 s can be tolerated for the follower to track the leader, as long as the latter remains within its FoV.
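
To make the pipeline concrete, the following is a minimal sketch of the two steps the abstract describes: recovering the leader's pose from detected passive markers (here via OpenCV's generic PnP solver, with placeholder marker geometry and camera intrinsics rather than the paper's actual RhOct layout and calibration) and turning that pose into a follower setpoint that holds a constant standoff distance.

```python
# Sketch only: marker coordinates, intrinsics, and the standoff value are
# illustrative assumptions, not the paper's RhOct configuration or calibration.
import numpy as np
import cv2

# Hypothetical 3D marker positions on the leader body (meters, leader frame).
MARKERS_3D = np.array([
    [ 0.10,  0.00, 0.00],
    [-0.10,  0.00, 0.00],
    [ 0.00,  0.10, 0.00],
    [ 0.00, -0.10, 0.05],
], dtype=np.float64)

# Placeholder pinhole intrinsics for the follower's onboard camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
DIST = np.zeros(5)  # assume an undistorted image

def relative_pose(markers_2d: np.ndarray):
    """Leader pose in the follower camera frame from detected marker pixels."""
    ok, rvec, tvec = cv2.solvePnP(MARKERS_3D, markers_2d, K, DIST)
    if not ok:
        raise RuntimeError("PnP failed")
    return rvec, tvec.ravel()

def follower_setpoint(tvec: np.ndarray, standoff: float = 1.5) -> np.ndarray:
    """Displacement (camera frame) that closes to a constant standoff distance."""
    rng = np.linalg.norm(tvec)
    line_of_sight = tvec / rng
    return tvec - standoff * line_of_sight  # move until the leader is `standoff` away
```

Feeding the returned displacement to the follower's position controller at each detection, rather than over an RF link, is the essence of the scheme; a real implementation would also rotate the setpoint into the follower's navigation frame and filter the raw PnP output.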

Highlights

  • The use of Unmanned Aerial Vehicles (UAVs) towards autonomous task completion has received increased attention in the past decade

  • The onboard Inertial Measurement Unit (IMU) on a UAV provides both orientation and position information when coupled to a Global Positioning System (GPS)/Global Navigation Satellite System (GNSS) receiver (a toy fusion sketch follows this list)

  • To account for vibration-induced errors in the pose measurements, this study also evaluates two drones hovering while facing each other
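
The IMU/GNSS coupling mentioned in the second highlight can be illustrated with a toy complementary filter: integrated acceleration supplies short-term motion, while the GNSS fix bounds long-term drift. The class, gain, and frame handling below are illustrative assumptions, not the paper's estimator.

```python
# Toy complementary-filter sketch of loose IMU/GNSS position fusion.
import numpy as np

class ImuGnssFusion:
    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha      # trust placed in the IMU dead-reckoning
        self.pos = np.zeros(3)  # fused position estimate (m)
        self.vel = np.zeros(3)  # velocity estimate (m/s)

    def imu_update(self, accel: np.ndarray, dt: float):
        """Dead-reckon with acceleration already rotated to the world frame."""
        self.vel += accel * dt
        self.pos += self.vel * dt

    def gnss_update(self, gnss_pos: np.ndarray):
        """Blend the drifting dead-reckoned position toward the GNSS fix."""
        self.pos = self.alpha * self.pos + (1.0 - self.alpha) * gnss_pos
```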


Summary

INTRODUCTION

The use of Unmanned Aerial Vehicles (UAVs) towards autonomous task completion has received increased attention in the past decade. In GPS-denied indoor environments, other sensing modalities are deployed for estimating the drone position within a swarm. These include LiDAR sensing (Kumar et al., 2017; Tsiourva and Papachristos, 2020; Yang et al., 2021), RSSI measurements (Yokoyama et al., 2014; Xu et al., 2017; Shin et al., 2020), RF-based sensing (Zhang et al., 2019; Shule et al., 2020; Cossette et al., 2021) and visual methods (Lu et al., 2018).
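
As an example of how one of these modalities yields inter-UAV range, RSSI approaches typically invert a log-distance path-loss model. A minimal sketch follows; the reference power and path-loss exponent are assumed values for illustration, not parameters from the cited works.

```python
import math

def rssi_to_distance(rssi_dbm: float,
                     rssi_at_1m: float = -40.0,
                     path_loss_exp: float = 2.0) -> float:
    """Range (m) from the model RSSI(d) = RSSI_1m - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

# e.g. rssi_to_distance(-60.0) ~= 10 m under the assumed parameters
```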

Visual Relative Pose Estimation Sensors and Techniques
Limitations of Pose Estimation Systems
Contributions
VISUAL-ASSISTED RELATIVE POSE ESTIMATION
LINEARIZED UAV-DYNAMICS
Experimental Setup
Relative Pose Measurement With Stationary UAVs
Relative Pose Measurement With Hovering UAVs
Leader-Follower Scenario Evaluation
Extension to a Multiple-Agent Scenario
Findings
CONCLUSIONS

