Abstract

With the rapid development of unmanned aerial vehicles (UAVs), abnormal state detection has become a critical technology for ensuring flight safety. The position and orientation system (POS) data and other measurements used to evaluate UAV flight status come from different sensors. Traditional abnormal state detection models ignore the differences among POS data in the frequency domain during feature learning, which leads to the loss of key feature information and limits further improvement of detection performance. To address this and improve UAV flight safety, this paper presents a method for detecting abnormal UAV states based on timestamp slicing and a multi-separable convolutional neural network (TS-MSCNN). First, TS-MSCNN partitions the POS data in the time domain using a set of specific timestamps, then extracts and fuses the key features to avoid losing feature information. Second, TS-MSCNN converts these feature data into grayscale images through data reconstruction. Finally, TS-MSCNN employs a multi-separable convolutional neural network (MSCNN) to learn key features more effectively. Binary and multi-class classification experiments conducted on real flight data from the Air Lab Fault and Anomaly (ALFA) dataset demonstrate that TS-MSCNN outperforms both traditional machine learning (ML) methods and recent deep learning methods in terms of accuracy.
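The first two stages described above — slicing the POS time series at fixed timestamps and reconstructing each slice as a grayscale image — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the segment boundaries, the 8×8 image size, the min-max normalization, and the zero-padding policy are all assumptions for demonstration.

```python
import numpy as np

def slice_by_timestamps(pos, timestamps, t_col=0):
    """Split POS records into segments at the given timestamp boundaries.

    pos: (N, F) array whose column `t_col` holds the timestamp.
    timestamps: increasing boundary values; yields len(timestamps)-1 segments.
    """
    segments = []
    for t0, t1 in zip(timestamps[:-1], timestamps[1:]):
        mask = (pos[:, t_col] >= t0) & (pos[:, t_col] < t1)
        segments.append(pos[mask])
    return segments

def to_grayscale(segment, size=(8, 8)):
    """Min-max normalize a segment's feature columns to [0, 255] and
    reshape (zero-padding or truncating) into a fixed-size 8-bit image.
    The image size is an illustrative choice, not taken from the paper."""
    flat = segment[:, 1:].ravel()          # drop the timestamp column
    lo, hi = flat.min(), flat.max()
    scaled = np.zeros_like(flat) if hi == lo else (flat - lo) / (hi - lo) * 255
    n = size[0] * size[1]
    padded = np.zeros(n)
    padded[:min(n, scaled.size)] = scaled[:n]
    return padded.reshape(size).astype(np.uint8)
```

The resulting uint8 images can then be fed to any 2-D CNN; in the paper's pipeline that network is the MSCNN, whose separable convolutions reduce parameter count relative to standard convolutions.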
