Abstract

Multispectral (MS) and panchromatic (Pan) image fusion, which is used to obtain images with both high spatial and high spectral resolution, plays an important role in many remote-sensing applications such as environmental monitoring, agriculture, and mineral exploration. This article presents an image fusion framework based on spatial distribution consistency. First, a YUV transform is adopted to separate the luminance component from the colour components of the original MS image. Then, the relationships between the ideal high-resolution multispectral (HRMS) colour components and the Pan band are established based on spatial distribution consistency, and finally an inverse transform is employed to obtain the fused image. Two types of relationship models are presented. The first model stems from the physical meaning of the consistency assumption and describes it with a local linear model. The second model uses its algebraic meaning directly to design the objective cost function and obtains the globally optimal solution. The two proposed models are compared with 15 widely used methods on six real remote-sensing image data sets. Experimental results show that the proposed method outperforms the compared state-of-the-art approaches.
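To make the overall pipeline concrete, the following is a minimal sketch of a generic YUV-based fusion skeleton: upsample the MS image to the Pan grid, transform to YUV, derive a new luminance from the Pan band, and invert the transform. It is not the paper's spatial-distribution-consistency models; the moment-matched luminance substitution and all function names are illustrative assumptions only.

```python
# Generic YUV-based pan-sharpening skeleton (illustrative, not the paper's method).
import numpy as np

# Approximate ITU-R BT.601 RGB -> YUV matrix and its inverse.
_RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                     [-0.147, -0.289,  0.436],
                     [ 0.615, -0.515, -0.100]])
_YUV2RGB = np.linalg.inv(_RGB2YUV)

def rgb_to_yuv(rgb):
    """rgb: (H, W, 3) float array -> (H, W, 3) YUV array."""
    return rgb @ _RGB2YUV.T

def yuv_to_rgb(yuv):
    """yuv: (H, W, 3) float array -> (H, W, 3) RGB array."""
    return yuv @ _YUV2RGB.T

def match_moments(pan, ref):
    """Match the mean and standard deviation of the Pan band to the MS luminance."""
    return (pan - pan.mean()) / (pan.std() + 1e-12) * ref.std() + ref.mean()

def fuse_yuv_substitution(ms_up, pan):
    """
    ms_up: (H, W, 3) MS image already upsampled to the Pan grid.
    pan:   (H, W)    panchromatic band.
    Returns an (H, W, 3) fused image via simple luminance substitution,
    standing in for the relationship models described in the article.
    """
    yuv = rgb_to_yuv(ms_up)
    yuv[..., 0] = match_moments(pan, yuv[..., 0])  # replace luminance with matched Pan
    return yuv_to_rgb(yuv)
```

In the article's framework, the substitution step above would be replaced by either the local linear model or the global cost-function solution relating the HRMS components to the Pan band.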
