Abstract

The speckle noise found in synthetic aperture radar (SAR) images severely degrades the efficiency of image interpretation, retrieval and other applications, so effective methods for despeckling SAR images are required. Traditional despeckling methods fail to balance the intensity of speckle-noise filtering against the retention of texture detail, whereas deep learning based methods have shown the potential to achieve this balance. This study therefore proposes a self-attention multi-scale convolutional neural network (SAMSCNN) for SAR image despeckling. The advantage of SAMSCNN is that it combines multi-scale feature extraction with a channel attention mechanism applied to the fused multi-scale features. In the SAMSCNN method, multi-scale features are extracted from the SAR image through convolution layers of different depths and concatenated; an attention mechanism then assigns different weights to the features of each scale, yielding weighted multi-scale fused features. Finally, the despeckled SAR image is generated through global residual noise reduction and image-structure fine-tuning. Despeckling experiments covering a variety of scenes were conducted on both simulated and real data. The performance of the proposed model was analysed using quantitative and qualitative evaluation methods and compared to the probabilistic patch-based (PPB), SAR block-matching 3-D (SAR-BM3D) and SAR-CNN methods. The experimental results show that the proposed method improves the objective indexes and offers clear advantages in visual quality over these classical methods, and it can provide key technical support for the practical application of SAR images.
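The core pipeline described above (multi-scale feature concatenation, channel-wise attention weighting, and global residual subtraction) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the number of branches, channel counts, reduction ratio and random weight initialisation are all hypothetical assumptions, and the convolutional branches themselves are stubbed out with random feature maps.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_attention(feat, w1, w2):
    """Squeeze-and-excitation style channel attention on a (C, H, W) feature map.

    Global-average-pools each channel, passes the channel descriptor through a
    small two-layer MLP (w1, w2), and rescales channels by the sigmoid output.
    """
    squeeze = feat.mean(axis=(1, 2))                 # global average pool -> (C,)
    hidden = np.maximum(w1 @ squeeze, 0.0)           # ReLU bottleneck
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # sigmoid gate in (0, 1)
    return feat * weights[:, None, None]             # per-channel reweighting

# Stand-ins for three conv branches of different depths (4 channels, 16x16 each).
branches = [rng.normal(size=(4, 16, 16)) for _ in range(3)]
fused = np.concatenate(branches, axis=0)             # channel-wise concat -> (12, 16, 16)

c = fused.shape[0]
w1 = 0.1 * rng.normal(size=(c // 2, c))              # hypothetical MLP weights
w2 = 0.1 * rng.normal(size=(c, c // 2))
weighted = channel_attention(fused, w1, w2)          # weighted multi-scale features

# Global residual: the network predicts the speckle component, which is
# subtracted from the noisy input (here both are toy arrays).
noisy_image = rng.normal(size=(16, 16))
predicted_noise = weighted.mean(axis=0)              # placeholder for the output head
despeckled = noisy_image - predicted_noise
```

The attention step is what lets the network emphasise the scale most informative for the current image content, rather than averaging all scales uniformly.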

