Abstract

To safely navigate and avoid obstacles in a complex dynamic environment, autonomous drones need a reaction time of less than 10 milliseconds. Event-based cameras have therefore become increasingly widespread in academic research on dynamic obstacle detection and avoidance for UAVs, as they outperform their frame-based counterparts in terms of latency. Several publications have reported significant results using these sensors; however, most of the experiments relied on indoor data. After a short introduction explaining the features of an event-based camera and how it differs from a traditional RGB camera, this work explores the limits of state-of-the-art event-based algorithms for obstacle recognition and detection by extending their results from indoor experiments to real-world outdoor experiments. Specifically, this paper shows the inaccuracy of event-based recognition algorithms, due to the insufficient number of events generated, and the inefficiency of event-based obstacle detection algorithms, due to the high ratio of noise.
