Abstract

Due to the complexity of airport backgrounds and runway structures, the performance of most runway extraction methods is limited. Furthermore, the military field currently attaches great importance to semantic changes of certain objects in an airport, yet few studies have addressed this subject. To address these issues, this paper proposes an accurate runway change analysis method comprising two stages: airport runway extraction and runway change analysis. In the former stage, airport knowledge, such as chevron markings and runway edge markings, is first applied in combination with multiple runway features to improve accuracy; the proposed method accomplishes airport runway extraction automatically. In the latter stage, semantic information and vector results of runway changes are obtained simultaneously by comparing bi-temporal runway extraction results. On six test images with approximately 0.5-m spatial resolution, the average completeness of runway extraction is nearly 100% and the average quality is nearly 89%. In addition, a final experiment using two sets of bi-temporal very high-resolution (VHR) images of runway changes demonstrated that the semantic results obtained by our method are consistent with the real situation, with a final accuracy of over 80%. Overall, airport knowledge, especially chevron markings and runway edge markings, is critical to runway recognition and detection, and multiple runway features, such as shape and parallel-line features, can further improve the completeness and accuracy of runway extraction. Finally, a small step has been taken toward studying runway semantic changes, which cannot be accomplished by change detection alone.
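
To illustrate the second stage, the sketch below compares bi-temporal runway extraction results (represented here as binary masks) to derive a coarse semantic change label. This is a minimal sketch under our own assumptions: the function name, the IoU threshold, and the change categories are illustrative and do not reproduce the paper's exact taxonomy or code.

```python
import numpy as np

def classify_runway_change(mask_t1: np.ndarray, mask_t2: np.ndarray,
                           iou_unchanged: float = 0.9) -> str:
    """Assign a coarse semantic change label to a runway footprint.

    mask_t1, mask_t2: boolean masks of the same runway area at two dates.
    The threshold and labels are illustrative assumptions, not the paper's
    published categories.
    """
    area_t1 = mask_t1.sum()
    area_t2 = mask_t2.sum()
    if area_t1 == 0 and area_t2 > 0:
        return "new runway"
    if area_t1 > 0 and area_t2 == 0:
        return "runway removed"
    inter = np.logical_and(mask_t1, mask_t2).sum()
    union = np.logical_or(mask_t1, mask_t2).sum()
    iou = inter / union if union else 1.0
    if iou >= iou_unchanged:
        return "unchanged"
    return "runway extended" if area_t2 > area_t1 else "runway shortened"

# Toy example: a runway lengthened between the two acquisition dates.
t1 = np.zeros((100, 100), dtype=bool); t1[45:55, 10:60] = True
t2 = np.zeros((100, 100), dtype=bool); t2[45:55, 10:90] = True
print(classify_runway_change(t1, t2))  # -> "runway extended"
```

In the paper the comparison is performed on vector extraction results, so the output carries both semantic labels and change geometry; the mask-based version above only conveys the general idea.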

Highlights

  • Due to its significance in both civil and military fields, the airport has gained increased attention in recent years [1,2], and methods for detecting and extracting airports have been widely developed, such as methods based on visual saliency [3,4,5,6,7,8] and methods based on deep learning [2,9,10,11].

  • To obtain more accurate runway extraction results, the similarity threshold T for grayscale template matching of the chevron markings (Section 2.2.3) and the length thresholds L1 and L2 for line segment detection based on the Probabilistic Hough Transform (Section 2.2.4) needed to be tuned in this study; a hedged sketch of these two operations follows this list.

  • From the perspective of airport knowledge, we propose a novel airport runway change analysis method to overcome the accuracy limitations of runway extraction caused by the complexity of airport backgrounds and runway structures, as well as the semantic ambiguity and accuracy limitations of runway change results produced by change detection methods.
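
As a hedged illustration of the two tunable operations mentioned above, the sketch below uses OpenCV: cv2.matchTemplate for grayscale template matching of the chevron marking, thresholded by a similarity value T, and cv2.HoughLinesP for Probabilistic Hough line detection, with L1 and L2 interpreted as length thresholds. The concrete default values, the matching metric, and the exact roles of L1 and L2 are assumptions on our part, since the paper's implementation is not reproduced here.

```python
import cv2
import numpy as np

def match_chevron_markings(gray, template, T=0.7):
    """Grayscale template matching of the chevron marking.

    T is the similarity threshold from Section 2.2.3; the value 0.7 and the
    use of normalized cross-correlation are assumptions.
    """
    score = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(score >= T)
    # Candidate top-left corners of chevron marking matches.
    return list(zip(xs.tolist(), ys.tolist()))

def detect_runway_lines(gray, L1=100, L2=400):
    """Line segment detection with the Probabilistic Hough Transform.

    L1 is treated as the minimum accepted segment length inside HoughLinesP
    and L2 as a stricter post-filter for runway-like lines; this mapping of
    the two thresholds is an assumption.
    """
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=L1, maxLineGap=10)
    if lines is None:
        return []
    kept = []
    for x1, y1, x2, y2 in lines[:, 0]:
        if np.hypot(x2 - x1, y2 - y1) >= L2:  # keep only long segments
            kept.append((int(x1), int(y1), int(x2), int(y2)))
    return kept
```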


Introduction

Due to its significance in both civil and military fields, the airport has gained increased attention in recent years [1,2], and methods for detecting and extracting airports have been widely developed, such as methods based on visual saliency [3,4,5,6,7,8] and methods based on deep learning [2,9,10,11]. With the construction, reconstruction, and expansion of airports around the world, major changes have taken place at airports, such as the construction and relocation of runways and the construction of terminal buildings and aprons. Although these changes, especially semantic changes of runways, can support decision-making by relevant departments, research on airport runway change analysis has rarely been explored, because the accuracy of some existing change detection methods [13,14] is not very high and these methods mainly focus on changes between general object classes, such as forest into farmland, rather than finer changes within a typical object.

