Abstract
With the ability to locate subtle trace objects over large-scale regions, coherent change detection (CCD) has been a vital research topic for synthetic aperture radar (SAR) systems. SAR CCD methods consist of a difference generation module, which computes the difference between a repeat-pass, repeat-geometry SAR image pair, and a difference analysis module, which extracts salient trace pixels from the difference image. Previous CCD methods mainly focus on devising a sophisticated working system or an appropriate statistical model to generate a high-quality difference image. In this article, we introduce deep learning into the CCD algorithm and propose a novel trace detection paradigm that hierarchically fuses an unsupervised coherent statistics model with a supervised deep learning model. Specifically, the complex reflectance change detection estimator is introduced to generate the difference image and reduce false alarms in low clutter-to-noise regions. Since the low correlation in the difference image caused by natural factors severely degrades detection performance, multiple statistics based on intensity summation and intensity difference are proposed to extract the water and vegetation regions, respectively, and suppress the corresponding false alarms. The construction of a coarse-to-fine image then exploits land-cover information and trace features, while a compressed U-Net improves the utilization efficiency of trace samples. Meanwhile, inductive transfer learning based on unsupervised pretraining and a few labeled trace samples helps to train an effective detection model. Experiments on measured SAR data demonstrate the effectiveness of the proposed method.
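The abstract does not give the form of the complex reflectance change detection estimator, so the sketch below uses the classical sample coherence magnitude, the standard CCD difference statistic, as a stand-in for the difference generation module. The window size `win` and the toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sample_coherence(f, g, win=5):
    """Classical sample coherence between two co-registered complex SAR
    images, estimated over a sliding win x win window.  Values near 1
    indicate unchanged scene content; values near 0 flag change.
    NOTE: a stand-in for the paper's complex reflectance change estimator."""
    cross = f * np.conj(g)
    # local means; the window size cancels in the ratio, so means act like sums
    num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
    den = np.sqrt(uniform_filter(np.abs(f) ** 2, win) *
                  uniform_filter(np.abs(g) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

# toy usage: a correlated image pair with one injected change patch
rng = np.random.default_rng(0)
f = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
g = f.copy()
g[40:60, 40:60] = rng.standard_normal((20, 20)) + 1j * rng.standard_normal((20, 20))
gamma = sample_coherence(f, g)   # coherence drops inside the changed patch
```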
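The false-alarm suppression step can be illustrated with simple per-pixel intensity statistics: water backscatters weakly in both passes (small intensity sum), while vegetation decorrelates between passes (large intensity difference). The sketch below is a minimal interpretation of that idea; the decibel thresholds `water_db` and `veg_ratio_db` are assumed values, not the paper's.

```python
import numpy as np

def clutter_masks(f, g, water_db=-15.0, veg_ratio_db=3.0):
    """Illustrative land-cover masks from the two intensity images.
    Water: low backscatter in BOTH passes -> small intensity sum.
    Vegetation: strong pass-to-pass fluctuation -> large intensity ratio.
    Thresholds are assumptions for demonstration only."""
    i1, i2 = np.abs(f) ** 2, np.abs(g) ** 2
    sum_db  = 10 * np.log10(i1 + i2 + 1e-12)
    diff_db = np.abs(10 * np.log10((i1 + 1e-12) / (i2 + 1e-12)))
    return sum_db < water_db, diff_db > veg_ratio_db

# usage: force "no change" in masked regions to suppress their false alarms
water, veg = clutter_masks(f, g)
gamma[water | veg] = 1.0
```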
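The abstract does not specify the compressed U-Net architecture; the sketch below is a generic small encoder-decoder with skip connections in PyTorch, offered only to make the idea concrete. The depth, channel widths, and the two-channel coarse-to-fine input (e.g., coherence plus a masked intensity channel) are all assumptions rather than the authors' design.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """A deliberately small U-Net; a generic stand-in for the paper's
    'compressed U-Net'.  Input spatial size must be divisible by 4."""
    def __init__(self, in_ch=2, base=16):
        super().__init__()
        def block(ci, co):
            return nn.Sequential(
                nn.Conv2d(ci, co, 3, padding=1), nn.BatchNorm2d(co), nn.ReLU(inplace=True),
                nn.Conv2d(co, co, 3, padding=1), nn.BatchNorm2d(co), nn.ReLU(inplace=True))
        self.enc1, self.enc2 = block(in_ch, base), block(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.bott = block(base * 2, base * 4)
        self.up2, self.dec2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2), block(base * 4, base * 2)
        self.up1, self.dec1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2), block(base * 2, base)
        self.head = nn.Conv2d(base, 1, 1)   # per-pixel trace logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b  = self.bott(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

logits = TinyUNet()(torch.randn(1, 2, 128, 128))   # -> (1, 1, 128, 128)
```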