Abstract
The rapid evolution of drone technology has expanded its applications across various domains, including delivery services, environmental monitoring, and search and rescue operations. However, many of these applications face significant challenges in GPS-denied environments, such as dense urban areas and heavily forested regions, where traditional navigation methods falter. This paper presents a novel multi-sensor fusion algorithm designed to enhance the localization accuracy of autonomous drones without reliance on GPS. By integrating data from an Inertial Measurement Unit (IMU), LiDAR, and visual sensors, the proposed approach compensates for the limitations of each individual sensor, enabling robust navigation in complex environments. Experimental results demonstrate that the algorithm achieves an average localization accuracy of 1.2 meters in urban areas and 1.5 meters in forested settings, showing resilience against sensor noise and environmental challenges. The implementation of loop closure techniques further improves long-term navigation accuracy, making the system suitable for prolonged missions. This research contributes to the growing body of knowledge on autonomous drone navigation and has significant implications for enhancing the operational capabilities of drones in real-world scenarios. Future work will focus on integrating additional sensors, exploring machine learning techniques for adaptive fusion, and conducting extensive field trials to validate the system's performance in dynamic environments.