Abstract

This paper presents an almost unsupervised fusion algorithm for linear feature (LF) extraction in synthetic aperture radar (SAR) interferometric data, in particular for mangroves/shorelines and thin internal channels. The spatial information on LFs is first extracted in the coherence image, where they are wider and more visible: water regions (in particular thin internal channels) appear as dark areas (low coherence) due to the temporal decorrelation of backscattering signals in these and surrounding regions, whereas conventional vegetation regions appear as brighter areas (high coherence). These approximate locations of LFs are further refined using the edge map obtained from a semantic fuzzy fusion of the coefficient of variation (CV) and the ratio of local means (RLM) measured in the amplitude image. The final detection of LFs is then performed by merging the two fuzzy inputs: the spatial information and the edge location map. The membership degree statistics of the CV and RLM semantic fusion measures are introduced in order to illustrate the location detection ability. The originality of this method, in comparison with conventional approaches, lies in its fusion scheme, which mimics the interpreter's behavior: it first uses the coherence image for a fuzzy detection, where thin LFs are more visible but poorly localized, and then the amplitude image, where they are poorly visible but more accurately localized, to obtain improved results. A quantitative performance evaluation is also presented. The method has been applied to real interferometric SAR images from the European Remote Sensing satellites over the western part of Cameroon.
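The pipeline the abstract describes — a low-coherence membership map fused with an amplitude-derived edge map built from the CV and RLM measures — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window sizes, the horizontal-only RLM direction, the linear membership functions, and the product t-norm used for fusion are all simplifying assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_cv(amp, size=7):
    """Local coefficient of variation: std / mean over a sliding window.
    High CV in speckled SAR amplitude suggests heterogeneity (edges)."""
    mean = uniform_filter(amp, size)
    mean_sq = uniform_filter(amp * amp, size)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

def ratio_of_local_means(amp, half=3):
    """Horizontal ratio of local means: min(m1/m2, m2/m1) of the mean
    amplitudes in the half-windows left and right of each pixel.
    Values well below 1 indicate an edge (dissimilar sides).
    (Only one orientation is shown; a full detector would take the
    minimum over several orientations.)"""
    size = 2 * half + 1
    mean = uniform_filter(amp, size=(1, size))
    left = np.roll(mean, half + 1, axis=1)    # mean of the left half-window
    right = np.roll(mean, -(half + 1), axis=1)  # mean of the right half-window
    return np.minimum(left / np.maximum(right, 1e-12),
                      right / np.maximum(left, 1e-12))

def fuzzy_fuse(coherence, edge_ratio, coh_thresh=0.3):
    """Toy fuzzy fusion: membership to 'dark LF region' from low coherence,
    membership to 'edge' from a low RLM ratio, combined with a product
    t-norm. The threshold and linear memberships are assumptions."""
    mu_dark = np.clip((coh_thresh - coherence) / coh_thresh, 0.0, 1.0)
    mu_edge = np.clip(1.0 - edge_ratio, 0.0, 1.0)
    return mu_dark * mu_edge

# Synthetic example: an amplitude step and a low-coherence strip at the
# same location stand out in the fused map.
amp = np.ones((16, 16)); amp[:, 8:] = 4.0
coh = np.full((16, 16), 0.8); coh[:, 7:9] = 0.1
fused = fuzzy_fuse(coh, ratio_of_local_means(amp))
```

In this sketch the fused map is high only where both cues agree: the low-coherence strip supplies the coarse (wide) detection, and the amplitude edge sharpens its position, mirroring the coarse-then-refine fusion order described above.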
