Abstract

The contribution of dual-polarized synthetic aperture radar (SAR) data to optical data for improving the accuracy of land use classification is investigated. For this purpose, different image fusion algorithms are implemented to obtain spatially improved images while preserving the spectral information. To compare the performance of the fusion techniques, microwave X-band dual-polarized TerraSAR-X data and multispectral (MS) RapidEye optical data are used. The test site, the Gediz Basin, covers both agricultural fields and artificial structures. Before the classification phase, four data fusion approaches are applied: (1) adjustable SAR-MS fusion, (2) Ehlers fusion, (3) high-pass filtering, and (4) Bayesian data fusion. The quality of the fused images is evaluated with statistical analyses, using several quality assessment methods. The classification performance of the fused images is then investigated comparatively using support vector machines as a kernel-based method, random forests as an ensemble learning method, the fundamental k-nearest neighbor classifier, and the maximum likelihood classifier. The experiments provide promising results for the fusion of dual-polarimetric SAR data and optical data in land use/cover mapping.
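The abstract does not specify which statistical quality metrics are used. As an illustrative sketch only (the function name and toy data below are hypothetical, not from this study), one widely used spectral-preservation check is the per-band Pearson correlation between the original MS bands and the fused bands:

```python
import numpy as np

def band_correlation(original_bands, fused_bands):
    """Per-band Pearson correlation between original MS bands and the
    corresponding fused bands; values near 1 suggest the fusion has
    preserved the spectral information of that band."""
    return [float(np.corrcoef(orig.ravel(), fused.ravel())[0, 1])
            for orig, fused in zip(original_bands, fused_bands)]

# Toy data: one nearly unchanged band and one unrelated band.
rng = np.random.default_rng(0)
band = rng.random((16, 16))
near_copy = band + 0.01 * rng.random((16, 16))  # spectral content preserved
unrelated = rng.random((16, 16))                # spectral content lost
scores = band_correlation([band, band], [near_copy, unrelated])
```

Here `scores[0]` is close to 1 while `scores[1]` is close to 0, separating a fusion result that preserves a band's spectral content from one that does not.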

Highlights

  • A wide variety of remote sensing satellite sensors provide data with diverse spectral and spatial resolution for the observation of many phenomena on the Earth

  • This paper extends the previous study of Ref. 10, which focuses on the fusion of the RapidEye data with VV polarized TerraSAR-X synthetic aperture radar (SAR) data

  • We conclude that the TSX VH fused Bayesian data fusion (BDF)-II image is the best fused image, in that it preserves the spectral information of the RapidEye data better than the other results

Introduction

A wide variety of remote sensing satellite sensors provide data with diverse spectral and spatial resolutions for the observation of many phenomena on the Earth. Land use and land cover mapping requires both high spectral and high spatial resolution for accurate analysis and interpretation. Data fusion is a key preprocessing method for integrating multisensor and multiresolution images, yielding a result with advantages over each individual data set.[1,2] Image fusion is an active research topic, and the performance of fusion techniques for different sensors has been examined using both qualitative and quantitative analyses.[3] Previous studies have shown that the fusion of synthetic aperture radar (SAR) and multispectral (MS) images improves the spatial information while preserving the spectral information.[4,5] Several data fusion methods have been proposed as part of image processing, and the contributions of fusion techniques to image classification accuracy have been studied.[3,6] For different applications of SAR and MS data fusion, various satellite images have been used and various results have been achieved in each study.[4,7,8] A generalized intensity modulation for the

Journal of Applied Remote Sensing
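Of the fusion approaches named in the abstract, high-pass filtering (HPF) fusion is simple enough to sketch: the high-frequency detail of the high-resolution image (here, a SAR band) is injected into the MS band. This is a minimal illustration, not the authors' implementation; the box-filter size and injection weight are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def hpf_fusion(ms_band, sar_band, weight=0.5, size=5):
    """High-pass-filter fusion: add the high-frequency detail of the
    SAR band (original minus a low-pass box-filtered copy) to the MS
    band. `weight` and `size` are illustrative, not from this study."""
    lowpass = ndimage.uniform_filter(sar_band.astype(float), size=size)
    detail = sar_band.astype(float) - lowpass
    return ms_band.astype(float) + weight * detail

# Toy example: an 8x8 "SAR" image with a vertical edge, a flat "MS" band.
sar = np.zeros((8, 8)); sar[:, 4:] = 100.0
ms = np.full((8, 8), 50.0)
fused = hpf_fusion(ms, sar)
```

In the fused band the edge detail from the SAR image appears, while the mean of the MS band (its overall spectral content) is preserved, which is the stated goal of such fusion schemes.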
