Abstract

Template matching between synthetic aperture radar (SAR) and optical images is an important topic for remote sensing tasks. In this letter, a deep learning framework based on a Siamese structure and U-Net is proposed. First, block-matching and 3-D filtering (BM3D), a sparse 3-D transform-domain collaborative filtering method, is applied to denoise the SAR image. Then, deep features of the SAR and optical images are extracted by the two branches of the proposed Siamese U-Net. Finally, a cross-correlation layer generates a heatmap of the image pair from the extracted deep features, and the location of the heatmap's peak is taken as the best matching result. During training, a balanced loss between the ground truth and the prediction is used so that positive and negative samples contribute equally to the learned model weights. Experiments on a common test dataset demonstrate that the proposed method improves matching accuracy and precision compared with existing template matching methods for SAR and optical images.
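The matching step and the balanced loss described above can be illustrated with a minimal PyTorch sketch. This is an illustration under assumptions, not the authors' implementation: the feature shapes, the function names (`match_by_cross_correlation`, `balanced_bce_loss`), and the equal-reweighting scheme for positive and negative pixels are hypothetical choices that follow the abstract's description of a cross-correlation layer, peak selection, and a balanced loss.

```python
import torch
import torch.nn.functional as F

def match_by_cross_correlation(opt_feat, sar_feat):
    """Correlate SAR template features against optical search features.

    opt_feat: (1, C, H, W) features from the optical branch (search region).
    sar_feat: (1, C, h, w) features from the SAR branch (template), h<=H, w<=W.
    Returns the heatmap and the (row, col) of its peak, taken as the match.
    """
    # F.conv2d computes cross-correlation, so the template features can be
    # used directly as the kernel of the cross-correlation layer.
    heatmap = F.conv2d(opt_feat, sar_feat).squeeze(0).squeeze(0)  # (H-h+1, W-w+1)
    flat_peak = torch.argmax(heatmap).item()
    row, col = divmod(flat_peak, heatmap.shape[1])
    return heatmap, (row, col)

def balanced_bce_loss(heatmap_logits, label):
    """Balanced cross-entropy between the predicted heatmap and ground truth.

    label is 1 at the true matching location and 0 elsewhere; positive and
    negative pixels are reweighted so each class contributes equally.
    The exact weighting used in the letter is an assumption here.
    """
    pos = label.eq(1).float()
    neg = label.eq(0).float()
    weight = pos / pos.sum().clamp(min=1.0) + neg / neg.sum().clamp(min=1.0)
    return 0.5 * F.binary_cross_entropy_with_logits(
        heatmap_logits, label, weight=weight, reduction="sum"
    )

# Toy usage with random tensors standing in for Siamese U-Net outputs.
opt_feat = torch.randn(1, 64, 96, 96)   # optical search-region features
sar_feat = torch.randn(1, 64, 32, 32)   # denoised-SAR template features
heatmap, (row, col) = match_by_cross_correlation(opt_feat, sar_feat)

label = torch.zeros_like(heatmap)
label[row, col] = 1.0                    # pretend the peak is the true location
loss = balanced_bce_loss(heatmap, label)
print(heatmap.shape, (row, col), loss.item())
```

Reweighting by the inverse class counts keeps the single positive pixel from being overwhelmed by the thousands of negatives in the heatmap, which is the imbalance the abstract's balanced loss is meant to address.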
