Abstract

Small unmanned aerial systems (UASs) present many potential solutions and enhancements to industry today but equally pose a significant security challenge; we only need to look at the levels of disruption caused by UASs at airports in recent years. The accuracy of UAS detection and classification systems based on radio frequency (RF) signals can be hindered by other interfering signals present in the same frequency band, such as Bluetooth and Wi-Fi devices. In this paper, we evaluate the effect of concurrent real-world interference from Bluetooth and Wi-Fi signals on convolutional neural network (CNN) feature extraction and machine learning classification of UASs. We assess multiple UASs that operate using different transmission systems: Wi-Fi, Lightbridge 2.0, OcuSync 1.0, OcuSync 2.0 and the recently released OcuSync 3.0. We consider 7 popular UASs, evaluating 2-class UAS detection, 8-class UAS type classification and 21-class UAS flight mode classification. Our results show that the process of CNN feature extraction using transfer learning and machine learning classification is fairly robust in the presence of real-world interference. We also show that UASs operating with the same transmission system can be distinguished. In the presence of interference from both Bluetooth and Wi-Fi signals, our results show 100% accuracy for UAS detection (2 classes), 98.1% (+/−0.4%) for UAS type classification (8 classes) and 95.4% (+/−0.3%) for UAS flight mode classification (21 classes).
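The pipeline described above pairs a frozen convolutional feature extractor with a classical machine learning classifier. As a minimal, self-contained sketch, a small bank of fixed random 2-D filters below stands in for the pretrained CNN (in the paper a transfer-learned network supplies the features), and a scikit-learn SVM performs the classification; the synthetic "spectrograms" and all names here are illustrative assumptions, not the authors' data or code.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)


def extract_features(spectrogram, filters):
    """Global-average-pooled ReLU responses of fixed conv filters.

    Stand-in for the frozen, pretrained CNN used for transfer-learning
    feature extraction in the paper's pipeline.
    """
    k = filters.shape[1]
    # All valid k x k windows of the spectrogram, shape (H-k+1, W-k+1, k, k)
    windows = np.lib.stride_tricks.sliding_window_view(spectrogram, (k, k))
    feats = []
    for f in filters:
        resp = np.maximum((windows * f).sum(axis=(2, 3)), 0.0)  # conv + ReLU
        feats.append(resp.mean())  # global average pooling
    return np.array(feats)


# Hypothetical data: 40 synthetic 64x64 "spectrograms" in two classes
# (interference only vs. interference plus a UAS-like energy burst).
filters = rng.standard_normal((16, 5, 5))
X, y = [], []
for label in (0, 1):
    for _ in range(20):
        s = rng.standard_normal((64, 64))
        if label:
            s += 2.0 * np.outer(np.hanning(64), np.hanning(64))  # toy UAS signature
        X.append(extract_features(s, filters))
        y.append(label)
X, y = np.array(X), np.array(y)

# Shuffled train/test split, then a classical classifier on the CNN-style features.
idx = rng.permutation(len(y))
train, test = idx[:30], idx[30:]
clf = SVC(kernel="rbf").fit(X[train], y[train])
pred = clf.predict(X[test])
```

The design point is that only the lightweight classifier is trained on the RF data; the convolutional front end is reused as-is, which is what makes the transfer-learning approach practical for modest-sized UAS datasets.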

Highlights

Accepted: 26 June 2021

  • Deloitte Economics recently reported that unmanned aerial systems (UASs) could tackle industry problems, such as logistics and road congestion [1]

  • We have shown that while UASs can pose a serious security challenge, especially in airfield scenarios, detection and classification can be achieved amongst real-world interference

  • Using convolutional neural network (CNN) feature extraction with transfer learning and machine learning classifiers, UASs operating with the same transmission systems can be distinguished amongst concurrent Bluetooth and Wi-Fi signals



Introduction

Deloitte Economics recently reported that UASs could tackle industry problems, such as logistics and road congestion [1]. Schumann et al. [32] use a convolutional neural network (CNN) to detect UASs for the birds vs drones competition. They find that when they expand the training data to include web images from many varied scenarios, their accuracy improves. Coluccia et al. [34] review the results of the 2020 drones vs birds competition, focusing in particular on the top three performing algorithms. They observed the biggest challenges to be detection at a distance and moving cameras, recommending that training and test data be expanded to cover these cases. Semkin et al. [37] consider radar cross-sections for detection and classification in urban environments and suggest further work to include the use of machine learning. The paper is structured as follows: Section 2 discusses the methodology, Section 3 the results and Section 4 the conclusions.

Experimental Setup
Graphical Signal Representation
Classification
Results
Discussion
