Abstract

Image fusion serves many purposes; most often it is used to produce an image with improved spatial resolution. The most common situation involves a pair of images in which the first, acquired by a multispectral sensor, has a pixel size greater than that of the second, acquired by a panchromatic sensor. By combining these images, fusion produces a new multispectral image with a spatial resolution equal to that of the panchromatic image. At the same time, image fusion can introduce significant distortion in the pixel spectra, which in turn affects the information content of remote sensing (RS) images (Teggi et al. 2003). Over the years, different fusion methods have been developed for improving the spatial and spectral resolutions of RS data sets. The techniques most frequently encountered in the literature are the intensity-hue-saturation (IHS) transform, the Brovey transform, principal components analysis (PCA), the Gram-Schmidt method, the local mean matching method, the local mean and variance matching method, the least squares fusion method, wavelet-based fusion, the multiplicative method and Ehlers fusion (Karathanassi et al. 2007, Ehlers et al. 2008). Most fusion applications use modified versions or combinations of these methods.

For RS data sets, three kinds of fusion can be conducted: fusion of optical data with optical data, fusion of microwave data with microwave data, and fusion of optical with microwave data. For several decades, the fusion of multiresolution optical images has been used successfully to improve the information content of images for visual interpretation and to enhance land surface features. Many studies have addressed improving the spatial resolution of multispectral images by using the high frequencies of panchromatic images while preserving the spectral information (Mascarenhas et al. 1996, Saraf 1999, Teoh et al. 2001, Teggi et al. 2003, Gonzalez et al. 2004, Colditz et al. 2006, Deng et al. 2008, Li and Leung 2009). A number of authors have successfully fused interferometric or multifrequency SAR images (Soh and Tsatsoulis 1999, Verbyla 2001, Baghdadi et al. 2002, Costa 2005, Palubinskas and Datcu 2008). Unlike the fusion of optical images, most fusions of synthetic aperture radar (SAR) data sets have aimed to increase the spectral variety of the classes. The fusion of optical and SAR data sets has been widely used for different applications; images acquired in the optical and microwave ranges of the electromagnetic spectrum have been found to provide unique, complementary information when they are integrated.
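Of the methods listed above, the Brovey transform is among the simplest to state: each multispectral band is scaled by the ratio of the panchromatic intensity to the sum of the multispectral bands, injecting the spatial detail of the panchromatic image while preserving the ratios between bands. The following is a minimal illustrative sketch, not the implementation used in any of the cited studies; it assumes the multispectral image has already been resampled to the panchromatic pixel grid (the resampling step is not shown).

```python
import numpy as np

def brovey_fusion(ms, pan, eps=1e-12):
    """Brovey-transform pan-sharpening sketch.

    ms  : (bands, H, W) multispectral image, already resampled to the
          panchromatic pixel size.
    pan : (H, W) panchromatic image.

    Each fused band is the MS band multiplied by pan / sum(MS bands),
    so spatial detail from the pan image is injected while the ratios
    between bands (and hence the pixel "colour") are preserved.
    """
    intensity = ms.sum(axis=0) + eps   # eps guards against division by zero
    return ms * (pan / intensity)      # broadcasts (H, W) over the band axis

# Toy example: a 3-band MS image on a 4x4 panchromatic grid.
rng = np.random.default_rng(0)
ms = rng.uniform(0.1, 1.0, size=(3, 4, 4))
pan = rng.uniform(0.1, 1.0, size=(4, 4))
fused = brovey_fusion(ms, pan)
```

By construction, the fused bands sum to the panchromatic intensity at every pixel, and the ratio between any two fused bands equals the ratio of the original multispectral bands; this is exactly the spectral-distortion trade-off discussed above, since absolute band values are rescaled.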
