Abstract

This paper presents a novel system for autonomous, vision-based drone racing that combines learned gate detection, nonlinear filtering, and time-optimal trajectory planning. The system was successfully deployed at the first autonomous drone racing world championship: the 2019 AlphaPilot Challenge. In contrast to traditional drone racing systems, which only detect the next gate, our approach makes use of any visible gate and takes advantage of multiple, simultaneous gate detections to compensate for drift in the state estimate and to build a global map of the gates. The global map and drift-compensated state estimate allow the drone to navigate through the race course even when the gates are not immediately visible, and further enable the planning of a near time-optimal path through the race course in real time based on approximate drone dynamics. The proposed system has been demonstrated to successfully guide the drone through tight race courses, reaching speeds of up to 8 m/s, and ranked second at the 2019 AlphaPilot Challenge.
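The drift-compensation idea described above can be illustrated with a minimal sketch. This is not the paper's actual filter (which fuses gate detections with inertial measurements in a nonlinear estimator); it is a simplified position-only correction, and the names `correct_drift`, `gate_map`, and `gain` are hypothetical, chosen here for illustration. Each detected gate that matches a gate in the global map yields a residual between where the gate should appear given the current estimate and where it was actually observed; averaging these residuals gives a correction that pulls the drifted estimate back toward consistency with the map.

```python
import numpy as np

def correct_drift(est_pos, observed_gates, gate_map, gain=0.5):
    """Correct a drifted position estimate using simultaneous gate detections.

    est_pos        -- current (drifted) drone position estimate, shape (3,)
    observed_gates -- list of (gate_id, relative_position) pairs, where
                      relative_position is the detected gate position
                      expressed relative to the drone
    gate_map       -- dict mapping gate_id to the gate's known global position
    gain           -- blending factor in (0, 1]; 1.0 trusts the gates fully
    """
    # Residual per gate: mapped global position minus the global position
    # implied by the current estimate plus the relative observation.
    residuals = [gate_map[gate_id] - (est_pos + rel)
                 for gate_id, rel in observed_gates]
    # Average over all visible gates and apply a fraction of the correction.
    correction = gain * np.mean(residuals, axis=0)
    return est_pos + correction
```

With perfect relative measurements of two mapped gates and `gain=1.0`, the correction exactly cancels the drift; with the default partial gain, repeated updates converge toward the true position, which mirrors why using several simultaneous detections (rather than only the next gate) stabilizes the estimate.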

Highlights

  • Autonomous drones have seen a massive gain in robustness in recent years and perform an increasingly large set of tasks across various commercial industries; however, they are still far from fully exploiting their physical capabilities

  • Autonomous navigation in indoor or GPS-denied environments typically relies on simultaneous localization and mapping (SLAM), often in the form of visual-inertial odometry (VIO) (Cadena et al. 2016)

  • From more than 400 teams that participated in a series of qualification tests, including a simulated drone race (Guerra et al. 2019), the top nine teams were selected to compete in the 2019 AlphaPilot Challenge


Summary

Motivation

Autonomous drones have seen a massive gain in robustness in recent years and perform an increasingly large set of tasks across various commercial industries; however, they are still far from fully exploiting their physical capabilities.

Video of our approach: https://youtu.be/DGjwm5PZQT8 · Talk at RSS 2020: https://youtu.be/k6vGEj1ZZWc · RSS paper: http://rpg.ifi.uzh.ch/docs/RSS20_Foehn.pdf

The goal of the challenge is to develop a fully autonomous drone that navigates through a race course using machine vision, and which could one day beat the best human pilot. While other autonomous drone races (Moon et al. 2017, 2019) focus on complex navigation, the AlphaPilot Challenge pushes the limits in terms of speed and course size to advance the state of the art and enter the domain of human performance. Due to the high speeds at which drones must fly in order to beat the best human pilots, the challenging visual environments (e.g., low light, motion blur), and the limited computational power of drones, autonomous drone racing …

Related work
Contribution
Race format
Drone specifications
Drone model
System overview
Perception
Planning and control
State estimation
Stage 1: predicting corner maps and part affinity fields
Gate detection
Part affinity fields
Stage 2: corner association
Training data
Measurement modalities
Gate measurements
Gate correspondences
Laser rangefinder measurement
Path planning
Time-optimal motion primitive
Sampling-based receding horizon path planning
Path parameterization
Attitude control
Results
Discussion and conclusion

