Abstract

Current satellite remote-sensing systems trade off spatial resolution against spectral and/or temporal resolution, which limits the use of remotely sensed data in many applications. Image fusion processes, including spatial and spectral fusion (SSF) and spatial and temporal fusion (STF), provide powerful tools for addressing these technological limitations. Although SSF and STF have been studied extensively in isolation, they have not yet been integrated into a unified framework for generating synthetic satellite images with high spatial, temporal and spectral resolution. By formulating these two types of fusion as one general problem, namely super-resolving a low-spatial-resolution image with a high-spatial-resolution image acquired under different conditions (e.g. at a different time and/or in different acquisition bands), this letter proposes a notion of unified fusion that can accomplish both SSF and STF in a single process. A Bayesian framework is then developed to implement SSF, STF and unified fusion, generating ‘virtual sensor’ data characterized simultaneously by high spatial, temporal and spectral resolution. The proposed method was applied to the fusion of Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Enhanced Thematic Mapper Plus (ETM+) images of the Hong Kong area; the average spatial correlation coefficient between the fused result and the input Landsat image exceeded 0.9 for the near-infrared, red and green bands, and the MODIS spectral properties were well preserved.

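To make the estimation idea concrete, the sketch below illustrates a maximum a posteriori (MAP) fusion of a coarse observation with a fine-resolution reference image under Gaussian likelihood and prior assumptions. This is a minimal illustration of the general Bayesian super-resolution formulation described in the abstract, not the letter's actual algorithm: the degradation model (block-mean blur plus decimation), the regularization weight `lam`, and all other parameters are illustrative assumptions.

```python
# Minimal sketch of Bayesian (MAP) image fusion, NOT the letter's algorithm.
# Assumed model: the low-res band y is a blurred, decimated version of the
# unknown fine-scale scene z, and a Gaussian prior ties z to a co-registered
# high-res reference band x (e.g. a Landsat band guiding a MODIS band).
import numpy as np

def downsample(z, scale):
    """Degrade a fine image: block-mean blur + decimation by `scale`."""
    h, w = z.shape
    return z.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))

def upsample(y, scale):
    """Adjoint of `downsample`: expand each coarse pixel, divide by scale^2."""
    return np.repeat(np.repeat(y, scale, axis=0), scale, axis=1) / scale**2

def map_fuse(y_low, x_high, scale, lam=0.1, lr=1.0, n_iter=200):
    """MAP estimate: minimise ||downsample(z) - y||^2 + lam * ||z - x||^2.

    The first term is the Gaussian likelihood of the coarse observation;
    the second is a Gaussian prior centred on the high-res reference.
    """
    z = x_high.copy()
    for _ in range(n_iter):
        residual = downsample(z, scale) - y_low            # data misfit
        grad = upsample(residual, scale) + lam * (z - x_high)
        z -= lr * grad                                     # gradient step
    return z

def spatial_cc(a, b):
    """Spatial correlation coefficient (Pearson r) between two bands."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

# Toy usage with synthetic data (all values illustrative):
rng = np.random.default_rng(0)
truth = rng.random((64, 64))                               # unknown fine scene
x_ref = truth + 0.05 * rng.standard_normal(truth.shape)   # high-res reference
y_obs = downsample(truth, 4)                               # coarse observation
z_hat = map_fuse(y_obs, x_ref, scale=4)
print(f"spatial CC vs reference: {spatial_cc(z_hat, x_ref):.3f}")
```

In this toy setting the MAP solution blends the coarse radiometry of `y_obs` with the spatial detail of `x_ref`, mirroring at a high level how the letter's fused product preserves MODIS spectral properties while matching the spatial structure of the input Landsat image.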