Abstract

The increasing popularity of small drones has created an urgent need for an effective drone-oriented surveillance system that can operate day and night. Herein, an acoustic and optical sensor-fusion-based system, termed the multimodal unmanned aerial vehicle 3D trajectory exposure system (MUTES), is presented to detect and track drone targets. MUTES combines multiple sensor modules, including a microphone array, a camera, and a lidar. The 64-channel microphone array provides semispherical surveillance with a high signal-to-noise ratio for sound-source estimation, while the long-range lidar and the telephoto camera enable subsequent precise target localization within a narrower but higher-definition field of view. MUTES employs a coarse-to-fine, passive-to-active localization strategy for wide-range (semispherical) detection and high-precision 3D tracking. To further increase fidelity, an environmental denoising model is trained that selects valid acoustic features of a drone target, thereby overcoming the drawbacks of traditional sound-source localization approaches under noise interference. The effectiveness of the proposed sensor-fusion approach is validated through field experiments. To the best of our knowledge, MUTES provides the farthest detection range, the highest 3D position accuracy, strong anti-interference capability, and an acceptable cost for detecting unverified drone intruders.
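The coarse-to-fine, passive-to-active handoff described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the pan/tilt steering interface, and the clamping ranges are all assumptions; the abstract only specifies that a coarse acoustic direction estimate steers the narrow-FOV lidar/camera head, which then yields a precise 3D position.

```python
import math

def coarse_doa_to_steering(azimuth_deg, elevation_deg):
    """Convert a coarse acoustic direction-of-arrival estimate from the
    microphone array into pan/tilt commands for the narrow-FOV head.
    (Hypothetical interface; ranges assume semispherical coverage.)"""
    azimuth_deg = azimuth_deg % 360.0            # wrap azimuth into [0, 360)
    elevation_deg = max(0.0, min(90.0, elevation_deg))  # clamp to upper hemisphere
    return {"pan": azimuth_deg, "tilt": elevation_deg}

def lidar_point_to_xyz(range_m, pan_deg, tilt_deg):
    """Refine to a 3D position once the lidar returns a range along the
    steered direction (standard spherical-to-Cartesian conversion)."""
    az = math.radians(pan_deg)
    el = math.radians(tilt_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)
```

In this sketch the passive stage (acoustics) only needs to be accurate enough to place the target inside the telephoto camera's field of view; the active stage (lidar ranging) then supplies the missing depth for the full 3D trajectory.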
