Abstract
Synthetic aperture radar (SAR) ship detection is an important part of marine monitoring. With developments in computer vision, deep learning has been applied to ship detection in SAR images, including the faster region-based convolutional neural network (R-CNN), the single-shot multibox detector, and densely connected networks. In the SAR ship detection field, deep learning achieves much better detection performance than traditional methods in nearshore areas, because traditional methods require sea–land segmentation before detection, and an inaccurate sea–land mask degrades their detection performance. However, current deep learning SAR ship detection methods still produce many false detections in land areas and miss some ships in sea areas. In this letter, a new network architecture based on the faster R-CNN is proposed to further improve detection performance by using a squeeze-and-excitation mechanism. First, feature maps are extracted and concatenated to obtain multiscale feature maps using an ImageNet-pretrained VGG network. After region of interest pooling, an encoded scale vector with values between 0 and 1 is generated from the subfeature maps. The scale vector is ranked, only the top $K$ values are preserved, and the remaining values are set to 0. The subfeature maps are then recalibrated by this scale vector, which suppresses redundant subfeature maps and improves the detector's performance. Experimental results on Sentinel-1 images show that the proposed method achieves an F1 score of 0.836, which is 9.7% better than the state-of-the-art method, while executing 14% faster.
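The top-$K$ recalibration step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel count, the use of global average pooling for the squeeze step, and the plain sigmoid excitation (the paper's excitation presumably uses learned layers) are all assumptions made for clarity.

```python
import numpy as np

def topk_se_recalibrate(sub_feature_maps, k):
    """Top-K squeeze-and-excitation recalibration (illustrative sketch).

    sub_feature_maps: array of shape (C, H, W), the subfeature maps after
    region of interest pooling. Shapes and the sigmoid excitation are
    assumptions; the abstract does not specify the exact layers.
    """
    # Squeeze: global average pooling gives one descriptor per channel.
    z = sub_feature_maps.mean(axis=(1, 2))
    # Excitation: map the descriptor to a scale vector with values in (0, 1).
    s = 1.0 / (1.0 + np.exp(-z))
    # Rank the scale vector; keep only the top K values, set the rest to 0.
    keep = np.argsort(s)[-k:]
    mask = np.zeros_like(s)
    mask[keep] = 1.0
    s = s * mask
    # Recalibrate: scale each subfeature map by its (possibly zeroed) value,
    # suppressing the redundant channels.
    return sub_feature_maps * s[:, None, None]
```

With this zeroing step, channels whose scale values fall outside the top $K$ are suppressed entirely, which is the mechanism the letter credits for reducing false detections from redundant subfeature maps.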