Abstract

In this paper, a method for estimating the local correspondence between synthetic aperture radar (SAR) and optical images is proposed using an image feature-based keypoint-matching algorithm. Accurate matching requires that common image features be obtained at corresponding locations. Because SAR and optical images differ markedly in appearance, it is difficult to find similar features on which to base geometric corrections. In this work, an image translator, built with a deep neural network (DNN) and trained by conditional generative adversarial networks (cGANs) with edge enhancement, was employed to find corresponding locations between SAR and optical images. With conventional cGANs, the translated images contain substantial blur, which degrades keypoint-matching accuracy. A novel method that applies an edge enhancement filter within the cGAN structure was therefore proposed to find corresponding points between SAR and optical images and thereby accurately register images from different sensors. The results suggest that the proposed method can accurately estimate corresponding points between SAR and optical images.
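
As a rough illustration of the edge-enhanced cGAN idea, the following PyTorch sketch augments a pix2pix-style generator objective with an edge-consistency term. The Sobel-based edge map, the loss weights, and the single-channel image assumption are illustrative assumptions only; the paper's actual edge enhancement filter and training setup may differ.

import torch
import torch.nn.functional as F

# Fixed 3x3 Sobel kernels (horizontal and vertical gradients).
SOBEL_X = torch.tensor([[-1., 0., 1.],
                        [-2., 0., 2.],
                        [-1., 0., 1.]]).view(1, 1, 3, 3)
SOBEL_Y = SOBEL_X.transpose(2, 3)

def edge_map(img):
    # Gradient-magnitude edge map of a single-channel batch (N, 1, H, W).
    gx = F.conv2d(img, SOBEL_X.to(img.device), padding=1)
    gy = F.conv2d(img, SOBEL_Y.to(img.device), padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)

def generator_loss(disc_fake_logits, fake_optical, real_optical,
                   lambda_l1=100.0, lambda_edge=10.0):
    # Adversarial term: the generator tries to make the discriminator
    # label translated (SAR -> optical) images as real.
    adv = F.binary_cross_entropy_with_logits(
        disc_fake_logits, torch.ones_like(disc_fake_logits))
    # Pixel-wise reconstruction term, as in pix2pix.
    l1 = F.l1_loss(fake_optical, real_optical)
    # Hypothetical edge-consistency term: penalize blur by matching the
    # edge maps of translated and ground-truth optical images.
    edge = F.l1_loss(edge_map(fake_optical), edge_map(real_optical))
    return adv + lambda_l1 * l1 + lambda_edge * edge

The edge term targets the blur problem noted above: a blurred translation has weak gradients, so matching edge maps pushes the generator toward sharper outputs.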
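
After translation, corresponding points can be estimated with a standard feature-based matcher. The snippet below is a minimal sketch using OpenCV's SIFT with Lowe's ratio test; the file names and the 0.75 ratio threshold are assumptions, and the paper's specific keypoint detector and matching criterion may differ.

import cv2

# Hypothetical inputs: a SAR image translated into the optical domain,
# and the optical image it should be registered to.
translated = cv2.imread("sar_translated.png", cv2.IMREAD_GRAYSCALE)
optical = cv2.imread("optical.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(translated, None)
kp2, des2 = sift.detectAndCompute(optical, None)

# Lowe's ratio test keeps only distinctive matches.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

# Corresponding point locations in each image.
pts_translated = [kp1[m.queryIdx].pt for m in good]
pts_optical = [kp2[m.trainIdx].pt for m in good]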
