Abstract

Foreign object debris (FOD) can critically damage aircraft engines and injure personnel in an airport environment. Airfield inspectors routinely check runways for FOD items, which vary in material, shape, and color, using both conventional and automated methods. The major shortcoming of current methods is their inability to detect all types of foreign objects accurately and in a timely manner for removal from airport runways. In this study, we address this lack of accuracy and timeliness by developing an object detection framework to detect FOD for quick removal from airfields. Our proposed framework consists of (i) an unmanned aerial system (UAS) for inspecting airfields and collecting data, (ii) data preprocessing and augmentation techniques to counter the limited variety of foreign object types, weather conditions, and airport surface materials present in the data sets, and (iii) a computer vision-based object detection model that attains high accuracy and fast inference for deployment in a real-world airport environment. We generated the training data with the UAS at an air force range and developed several models within our framework, including the You Only Look Once (YOLO) family of one-stage object detectors. The models are evaluated on previously unseen data collected by the UAS and on a publicly available data set of FOD images. The experimental results demonstrate that our approach, using a YOLOv4 model with transfer learning, provides faster inference for FOD detection and outperforms the other models on the precision and recall metrics.
