Abstract
The Kennaugh framework has proven to be a powerful tool for the preparation of multi-sensor SAR data in recent years. Using intensity-based (an-)isotropic diffusion algorithms such as Multi-scale Multi-looking or the Schmittlets, it even enables robust pre-classification change detection from multi-polarized images. The one element still missing, the integration of multi-mode SAR data into a single image, is accomplished in this article. Furthermore, the Kennaugh decomposition is extended to multi-spectral data, so that arbitrary Kennaugh elements, whether from SAR or optical images, can be fused. The mathematical description of the most general image fusion is derived and applied to four scenarios. The validation section examines the distributions of mean and gradient in the original and fused images with the help of scatter plots. The results show that the fused images adopt the spatial gradient of the input image with the higher geometric resolution and preserve the local mean of the input image with the higher polarimetric, and thus also radiometric, resolution. Regarding the distribution of entropy and alpha angle, the fused images are always characterized by a higher variance in the entropy-alpha plane and therewith a higher resolution in the polarimetric domain. The proposed algorithm guarantees optimal information integration while ensuring the separation of intensity and polarimetric/spectral information. The Kennaugh framework is now ready to be used for sharpening multi-sensor image data in the spatial, radiometric, polarimetric, and even spectral domain.
Highlights
Earth observation satellites with their diversity of sensors provide a variety of spectral, geometric, temporal, and radiometric resolutions.
This section illustrates the results of the data fusion approach. Scenario 1: A quad-pol image acquisition of ALOS-PALSAR-2 is fused with a dual-co-pol spotlight image of TerraSAR-X in order to slightly enhance the spatial resolution and to stabilize the co-polarized information (Sec. 4.3, Fig. 1).
Scenario 3: The intensity of a quad-pol image acquired by ALOS-PALSAR-2 is replaced by the total intensity of the channels measured by an airborne camera in order to enhance the spatial resolution (Sec. 4.2, Fig. 3).
Summary
Earth observation satellites with their diversity of sensors provide a variety of spectral, geometric, temporal, and radiometric resolutions. Their rising number raises the issue of image fusion in order to enhance the interpretation capabilities of image features (Pohl and van Genderen, 1998; Abdikan et al., 2008) and to reduce the amount of data at the same time. With respect to the interpretation of backscatter values, this immediately leads to an increase of the information content (Simone et al., 2001; Farina et al., 1996). This image fusion is novel and promising as it supports the understanding and interpretation of SAR image features due to different electromagnetic signatures. Four scenarios are designed in order to prove the added value of the fused image: (1) traditional SAR-Sharpening in the spatial domain, (2) SyntheticQuadPol, (3) SAR-Sharpening involving a pan-chromatic image, and (4) the fusion of SAR and optical features provided by Sentinel-1 and Sentinel-2.
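The core operation behind the sharpening scenarios, substituting the total intensity of a Kennaugh element stack with a higher-resolution intensity image while preserving the normalized polarimetric information, can be sketched as follows. This is a minimal illustration, not the article's implementation: the function name `sharpen_kennaugh` and the layout of the stack (K0 first, remaining elements after it, all co-registered to the same grid) are assumptions for this example.

```python
import numpy as np

def sharpen_kennaugh(kennaugh, pan_intensity):
    """Replace the total intensity K0 of a Kennaugh stack with a
    co-registered higher-resolution intensity image, keeping the
    polarimetric ratios K_i / K0 unchanged.

    kennaugh:      array of shape (n, H, W); kennaugh[0] is K0.
    pan_intensity: array of shape (H, W), the new intensity layer.
    """
    k0 = kennaugh[0]
    # Guard against division by zero in dark pixels.
    safe_k0 = np.where(k0 > 0, k0, 1.0)
    ratios = kennaugh[1:] / safe_k0          # polarimetric information K_i / K0
    fused = np.empty_like(kennaugh)
    fused[0] = pan_intensity                 # substitute the intensity layer
    fused[1:] = ratios * pan_intensity       # rescale ratios to the new intensity
    return fused
```

Because only K0 is exchanged and the ratios K_i / K0 are carried over unchanged, the fused product inherits the spatial gradient of the sharper intensity image while the polarimetric content of the original acquisition is preserved, which matches the behavior reported in the validation.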