Abstract

Object detection is one of the most important tasks in remote sensing image analysis, and aircraft detection is a hot topic within it. One challenge of aircraft detection is that an aircraft is relatively small compared with the image: an image in our aircraft data set contains about fifteen million pixels, while an aircraft occupies only about three thousand pixels. This large size difference between the image and the object makes it impossible to apply a general object detection method directly to remote sensing images. Another challenge is that aircraft sizes vary widely, i.e., the scale span of the aircraft is large due to both the shooting distance and the scale of the aircraft itself. To address these two problems, in this paper we propose a new scheme containing two special networks. The first is a background filtering network designed to crop the partial areas where aircrafts may exist. The second is a scale prediction network mounted on Faster R-CNN to recognize the scales of the aircrafts contained in the areas cropped by the first network. Although the two networks are relatively simple in structure, the scale problem is well solved. Experiments on the aircraft data set show that our background filtering network can crop the areas containing aircrafts from remote sensing images, and that our scale prediction network improves the precision rate, recall rate, and mean average precision (mAP).
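The two-stage scheme described above can be illustrated with a minimal sketch. Both stages are stand-ins here: the real background filtering network and scale prediction network are CNNs, while this toy version scores fixed-size tiles with a simple brightness heuristic and runs a hypothetical detector only on the tiles that survive.

```python
# Hypothetical sketch of the two-stage pipeline: filter background tiles
# first, then detect only inside the kept tiles. All functions here are
# illustrative stand-ins, not the paper's actual networks.

def background_filter(image, tile_size, score_fn, threshold):
    """Stage 1: slide a non-overlapping window over the large image and
    keep only tiles whose score suggests an aircraft may be present."""
    h, w = len(image), len(image[0])
    kept = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tile = [row[x:x + tile_size] for row in image[y:y + tile_size]]
            if score_fn(tile) >= threshold:
                kept.append((y, x, tile))
    return kept

def detect(tiles, detector_fn):
    """Stage 2: run the (scale-aware) detector only on the kept tiles,
    returning results in the coordinates of the original image."""
    return [(y, x, detector_fn(tile)) for y, x, tile in tiles]

# Toy usage: a 6x6 "image" where a bright blob marks an aircraft.
image = [[0] * 6 for _ in range(6)]
image[4][4] = image[4][5] = image[5][4] = 255  # bright blob in one corner

mean_brightness = lambda t: sum(map(sum, t)) / (len(t) * len(t[0]))
kept = background_filter(image, tile_size=3,
                         score_fn=mean_brightness, threshold=10)
print(len(kept))  # → 1: only the tile containing the blob survives
```

Because the detector never sees the discarded background tiles, the cost of the expensive second stage scales with the number of candidate areas rather than with the full fifteen-million-pixel image.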
