Abstract
The detection of small infrared targets with low signal-to-noise ratio and low contrast in noisy, cluttered backgrounds is challenging and therefore a domain of active research. Traditional methods produce large numbers of false alarms and missed detections, while convolutional neural network-based methods may lose small targets in deep layers or fail to adequately capture the details of target edge contours. This paper therefore proposes MSAFFNet, which performs infrared small target detection within an encoder-decoder framework. In the encoder stage, small target features are extracted with a ResNet-20 backbone, and global contextual features of small targets are extracted with an atrous spatial pyramid pooling (ASPP) module. In the decoding stage, a dual-attention module selectively enhances the spatial details of the target at the shallow level and the representative semantic features at the deep level; multi-scale feature maps are then concatenated to achieve effective feature fusion. Additionally, multi-scale labels are constructed to focus on target contour details and internal features, based on edge information and an internal feature aggregation module. Experiments on the NUAA-SIRST, NUDT-SIRST, and XDU-SIRST datasets show that the proposed approach outperforms representative methods and achieves improved detection performance.
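To make the encoder-decoder pattern described above concrete, the following is a minimal PyTorch sketch of its main ingredients: dilated-convolution context pooling (ASPP-style), channel-plus-spatial attention gating of shallow and deep features, and multi-scale concatenation before a per-pixel prediction head. All module names, layer counts, and channel widths here (SimpleASPP, DualAttention, TinySegNet) are illustrative assumptions, not the authors' MSAFFNet implementation.

```python
# Illustrative sketch only: the structure follows the abstract's description,
# but every shape and module choice below is an assumption, not MSAFFNet.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleASPP(nn.Module):
    """Atrous spatial pyramid pooling: parallel dilated convs, then fuse."""
    def __init__(self, in_ch, out_ch, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r) for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x):
        return self.fuse(torch.cat([F.relu(b(x)) for b in self.branches], dim=1))

class DualAttention(nn.Module):
    """Channel and spatial gating that reweights a feature map."""
    def __init__(self, ch):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(ch, ch, 1), nn.Sigmoid())
        self.spatial = nn.Sequential(nn.Conv2d(ch, 1, 7, padding=3), nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel(x)      # emphasize informative channels
        return x * self.spatial(x)   # emphasize informative locations

class TinySegNet(nn.Module):
    """Encoder-decoder: downsample, pool context, attend, upsample, fuse."""
    def __init__(self):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        self.aspp = SimpleASPP(32, 32)
        self.att_deep = DualAttention(32)     # deep level: semantic features
        self.att_shallow = DualAttention(16)  # shallow level: spatial detail
        self.head = nn.Conv2d(16 + 32, 1, 1)  # fuse multi-scale feature maps

    def forward(self, x):
        s = self.enc1(x)                       # shallow feature map
        d = self.att_deep(self.aspp(self.enc2(s)))
        d = F.interpolate(d, size=s.shape[-2:], mode="bilinear",
                          align_corners=False)
        fused = torch.cat([self.att_shallow(s), d], dim=1)
        return torch.sigmoid(self.head(fused)) # per-pixel target probability

mask = TinySegNet()(torch.randn(1, 1, 64, 64))
print(mask.shape)  # torch.Size([1, 1, 64, 64])
```

In this reading, the shallow branch preserves the fine spatial detail needed to delineate small targets, while the ASPP-plus-attention deep branch supplies global context to suppress background clutter; concatenating the two is one simple form of the multi-scale fusion the abstract describes.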