Abstract

Building change detection (CD), which is important for urban monitoring, can be performed in near real time by comparing prechange and postchange very-high-spatial-resolution (VHR) synthetic-aperture-radar (SAR) images. However, multitemporal VHR SAR images are complex: they show high spatial correlation, are prone to shadows, and exhibit inhomogeneous signatures. Spatial context must, therefore, be taken into account to effectively detect changes in such images. Recently, convolutional-neural-network (CNN)-based transfer learning techniques have shown strong performance for CD in VHR multispectral images. However, their direct use for SAR CD is impeded by the absence of labeled SAR data and, thus, of pretrained networks. To overcome this, we exploit the availability of paired unlabeled SAR and optical images to train for the suboptimal task of transcoding SAR images into optical images using a cycle-consistent generative adversarial network (CycleGAN). The CycleGAN consists of two generator networks: one transcoding SAR images into the optical image domain and the other projecting optical images into the SAR image domain. After unsupervised training, the generator that transcodes SAR images into optical ones is used as a bitemporal deep feature extractor to obtain optical-like features from the bitemporal SAR images. Deep change vector analysis (DCVA) and fuzzy rules can then be applied to these features to identify changed buildings (new/destroyed). We validate our method on two data sets of bitemporal VHR SAR image pairs over the cities of L'Aquila and Trento (Italy).
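To make the pipeline concrete, the following is a minimal PyTorch sketch of the core idea: reuse the SAR-to-optical generator of a trained CycleGAN as a bitemporal deep feature extractor, then apply deep change vector analysis (DCVA) to the extracted features. All names here (Sar2OptGenerator, dcva_change_map), the toy generator architecture, the choice of layer for features, and the threshold are illustrative assumptions, not the authors' exact configuration; the fuzzy rules that separate new from destroyed buildings are omitted.

# Illustrative sketch only: toy generator and placeholder threshold,
# not the paper's actual CycleGAN architecture or DCVA configuration.
import torch
import torch.nn as nn

class Sar2OptGenerator(nn.Module):
    """Toy stand-in for the CycleGAN SAR-to-optical generator."""
    def __init__(self, in_ch=1, feat_ch=64, out_ch=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 7, padding=3), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Conv2d(feat_ch, out_ch, 7, padding=3)

    def forward(self, x):
        # Full transcoding path: SAR patch in, optical-like patch out.
        return self.decoder(self.encoder(x))

    def features(self, x):
        # Intermediate activations serve as the "optical-like" deep features.
        return self.encoder(x)

@torch.no_grad()
def dcva_change_map(gen, sar_t1, sar_t2, threshold=0.5):
    """DCVA core step: per-pixel magnitude of the deep feature difference
    between the two dates, thresholded into a binary change mask."""
    f1 = gen.features(sar_t1)
    f2 = gen.features(sar_t2)
    magnitude = torch.norm(f2 - f1, dim=1)   # (B, H, W) change magnitude
    return magnitude > threshold             # binary change mask

# Usage with random stand-in "bitemporal SAR" tiles:
gen = Sar2OptGenerator()         # in practice: trained unsupervised within the CycleGAN
t1 = torch.rand(1, 1, 256, 256)  # prechange SAR image
t2 = torch.rand(1, 1, 256, 256)  # postchange SAR image
mask = dcva_change_map(gen, t1, t2)
print(mask.float().mean())       # fraction of pixels flagged as changed

Note that only the SAR-to-optical generator is reused at detection time; the optical-to-SAR generator exists solely to enforce cycle consistency during the unsupervised training phase.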
