Abstract

The fusion of a hyperspectral image (HSI) and a multispectral image (MSI) can significantly improve the ability to recognize and identify ground targets. The quality of spatial information and the fidelity of spectral information are normally contradictory, yet both are non-negligible indicators for multi-source remote-sensing image fusion. The smoothing filter-based intensity modulation (SFIM) method is a simple yet effective model for image fusion, which can improve the spatial texture details of the image while maintaining its spectral characteristics. However, traditional SFIM performs poorly at sharpening edge information, degrading the overall fusion result. To obtain better spatial information, a spatial-filter-based improved LSE-SFIM algorithm is proposed in this paper. First, the least square estimation (LSE) algorithm is combined with SFIM, which effectively improves the spatial information quality of the fused image. At the same time, to better preserve spatial information, four spatial filters (mean, median, nearest, and bilinear) are applied to the simulated MSI image to extract fine spatial detail. Six quality indexes are used to compare the performance of the algorithms, and the experimental results demonstrate that the LSE-SFIM based on the bilinear filter (LSE-SFIM-B) performs significantly better than the traditional SFIM algorithm and the other spatially enhanced LSE-SFIM variants proposed in this paper. Furthermore, LSE-SFIM-B achieves performance similar to three state-of-the-art HSI-MSI fusion algorithms (CNMF, HySure, and FUSE) with a much shorter computing time.
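The classical SFIM rule the abstract builds on modulates a low-resolution band by the ratio of a high-resolution band to its smoothed version. The following is an illustrative sketch, not the paper's code: the function names, the box-filter choice for smoothing, and the `eps` guard against division by zero are assumptions.

```python
import numpy as np

def mean_filter(img, size):
    """Moving-average (box) filter with edge padding, NumPy only."""
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def sfim_fuse(hsi_band, msi_band, ratio=4, eps=1e-6):
    """Classical SFIM: fused = HSI * MSI / smooth(MSI), where smooth()
    degrades the MSI band to roughly the HSI's spatial frequency.
    `hsi_band` is assumed already upsampled to the MSI grid."""
    smoothed = mean_filter(msi_band, ratio)
    return hsi_band * msi_band / (smoothed + eps)
```

For spatially constant regions the smoothed MSI equals the MSI itself, so the fused result reduces to the HSI value, which is why SFIM preserves spectral characteristics well while injecting high-frequency spatial detail elsewhere.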

Highlights

  • In recent years, a large number of remote-sensing satellites have been launched as Earth observation technology has developed [1,2]

  • Multi-sensor data fusion has emerged in response; it can effectively exploit the complementary information from multi-platform observations, making land surface monitoring more accurate and comprehensive

  • To evaluate the performance of the fusion method objectively and quantitatively, we carry out simulation experiments using low-spatial-resolution hyperspectral images obtained by spatially resampling real data and high-spatial-resolution multispectral images obtained by spectral resampling
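The simulation protocol described in the last highlight (spatially degrading real data so that a full-resolution reference exists for quantitative evaluation) can be sketched as follows. This is an illustrative assumption, not the paper's exact degradation model: the function name and the block-averaging choice are placeholders.

```python
import numpy as np

def simulate_lr(ref, ratio=4):
    """Spatially degrade a reference band by block-averaging,
    producing a simulated low-resolution HSI input whose ground
    truth (the reference) is known for quantitative scoring."""
    h, w = ref.shape
    h2, w2 = h - h % ratio, w - w % ratio  # crop to a multiple of ratio
    blocks = ref[:h2, :w2].reshape(h2 // ratio, ratio, w2 // ratio, ratio)
    return blocks.mean(axis=(1, 3))
```

Fusing the simulated low-resolution input back to full resolution and comparing against the untouched reference is what makes objective quality indexes computable.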


Introduction

A large number of remote-sensing satellites have been launched as Earth observation technology has developed [1,2]. The continuous development of remote-sensing applications such as geological exploration [6], resource and environmental investigation [7–9], agricultural monitoring [10–12], and urban planning [13–16] has greatly increased the demand for remote-sensing data and driven improvements in satellite sensor performance. Multi-sensor data fusion has emerged in response; it can effectively exploit the complementary information from multi-platform observations, making land surface monitoring more accurate and comprehensive. Multi-source remote-sensing data fusion refers to processing multi-source data with complementary information in time or space according to certain rules, so as to obtain a more accurate and informative composite image than any single data source provides.
