Abstract

Visible (VS) and near-infrared (NIR) image fusion is a common approach to improving image visibility, producing a fused image that retains rich scene details while keeping colors close to those of the VS image. However, preserving edge details while preventing color distortion remains a fundamental yet challenging problem in VS-NIR fusion. To address this problem, this article proposes a novel image fusion method based on a multiscale gradient-guided edge-smoothing (MGES) model and a local gradient weight. According to the spectral characteristics of VS and NIR images, the local gradient weight is constructed by analyzing the local gradient difference between the VS and NIR images, so that only the prominent details of the NIR image are transferred to the VS image, thereby avoiding confusion of spectral information. Furthermore, the MGES model simultaneously generates a Laplacian pyramid and a gradient domain guided filtering-based weight pyramid; it fully considers the gradient correlation between neighboring pixels and omits the Gaussian filtering step, thus effectively preserving spatial details and suppressing halo artifacts. Subjective and objective experimental results demonstrate the superiority of the proposed method over state-of-the-art methods in terms of preserving edge details and maintaining color naturalness.
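
The abstract does not give the paper's formulas, but the local-gradient-weight idea can be illustrated with a minimal sketch. The code below assumes Sobel gradients, a simple box window, and the function and parameter names shown (all hypothetical, not the authors' implementation); it produces a weight that is large only where the NIR image carries locally stronger detail than the VS image.

```python
import numpy as np
from scipy import ndimage


def local_gradient_weight(vs_gray, nir, window=7, eps=1e-6):
    """Hypothetical sketch of a local gradient weight.

    vs_gray, nir: float arrays in [0, 1] with the same shape.
    Returns a weight in [0, 1] that is large only where the NIR
    gradient is locally stronger than the VS gradient.
    """
    def grad_mag(img):
        gx = ndimage.sobel(img, axis=1)
        gy = ndimage.sobel(img, axis=0)
        return np.hypot(gx, gy)

    # Window-averaged (local) gradient magnitudes of both inputs.
    g_vs = ndimage.uniform_filter(grad_mag(vs_gray), size=window)
    g_nir = ndimage.uniform_filter(grad_mag(nir), size=window)

    # Keep only the positive part of the NIR-minus-VS gradient gap, so that
    # only details prominent in NIR (and weak in VS) are transferred.
    diff = np.maximum(g_nir - g_vs, 0.0)
    return diff / (g_nir + g_vs + eps)


# Toy usage: inject NIR high-frequency detail into the VS luminance,
# modulated by the weight (a simplified stand-in for the pyramid fusion).
# vs_gray, nir = ...  # load and normalize images to [0, 1]
# w = local_gradient_weight(vs_gray, nir)
# fused_luma = vs_gray + w * (nir - ndimage.uniform_filter(nir, size=7))
```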
