Abstract

The main difficulty of synthetic aperture radar (SAR)-optical image matching or registration lies in the significant heterogeneity introduced by the different imaging mechanisms of SAR and optical sensors. Instead of directly using the raw image pair, transforming the pair into a feature domain where the two images have a homogeneous feature representation is believed to be more effective. Inspired by image segmentation, we develop an end-to-end deep learning model for SAR-optical matching based on a siamese U-net with a fast Fourier transform (FFT) correlation layer. First, the siamese U-net with shared weights extracts the feature maps of the SAR and optical images and projects the heterogeneous images into a homogeneous space. Then, the two feature maps are cross-correlated or normalized cross-correlated by the FFT layer to obtain a similarity heatmap. Finally, the heatmap is sent into a softmax2d classifier to determine the best match, thus converting matching into classification. The nonlinear mapping capability of deep learning can effectively handle the intensity variation across the different imaging modalities; the encoder–decoder architecture with skip connections in the U-net exploits global information while preserving local resolution and position information, thus guaranteeing high accuracy and robustness; in addition, the FFT correlation improves efficiency and enables training with large image pairs. Experiments show that the proposed method achieves pixel-level matching accuracy.
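The correlation-and-classification step described above can be illustrated outside the network itself. The sketch below, a minimal NumPy illustration and not the authors' implementation, cross-correlates two 2-D feature maps via the FFT (multiplying one spectrum by the conjugate of the other), applies a 2-D softmax to turn the heatmap into a probability map, and reads off the best-matching offset as the argmax:

```python
import numpy as np

def fft_cross_correlation(feat_a, feat_b):
    """Circular cross-correlation of two same-size 2-D feature maps via the FFT.

    Spatial-domain correlation equals elementwise multiplication of one
    spectrum with the complex conjugate of the other in the frequency domain.
    """
    F_a = np.fft.fft2(feat_a)
    F_b = np.fft.fft2(feat_b)
    return np.fft.ifft2(F_a * np.conj(F_b)).real

def softmax2d(heatmap):
    """Flatten, softmax, reshape: converts the heatmap to a probability map."""
    flat = heatmap.ravel()
    flat = flat - flat.max()  # subtract max for numerical stability
    p = np.exp(flat)
    p /= p.sum()
    return p.reshape(heatmap.shape)

# Toy example: one "feature map" is the other circularly shifted by (5, 9).
rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
shifted = np.roll(img, shift=(5, 9), axis=(0, 1))

heat = fft_cross_correlation(shifted, img)
prob = softmax2d(heat)
dy, dx = np.unravel_index(np.argmax(prob), prob.shape)
print(dy, dx)  # → 5 9, the correlation peak recovers the true shift
```

In the paper's pipeline the two inputs would instead be the SAR and optical feature maps produced by the shared-weight U-net, and the softmax2d output would feed a classification loss rather than a plain argmax.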
