In this paper, a novel de-ghosting image fusion technique is presented that enhances the quality of low dynamic range images using multiple exposures taken with an ordinary camera while removing ghosting artifacts. In the proposed algorithm, the source images, captured under different exposure settings, are first decomposed into base and detail layers using two-scale decomposition. The base and detail layers contain the large-scale and small-scale intensity variations of the source images, respectively. A Laplacian-of-Gaussian filter is applied to the source images to extract edge information, and a saliency map of the edges is then computed. To remove ghosting artifacts, a weight matrix is calculated by applying a median filter to the histogram-equalized source images; this matrix is combined with the saliency map to produce more accurate weights. Separate weights for the base and detail layers are then refined using guided image filtering. Finally, the weighted base and detail layers of the source images are fused to generate a vivid, enhanced image free of artifacts. The proposed technique is evaluated both qualitatively and quantitatively. Comparisons with other state-of-the-art techniques in terms of Yang's metric ($Q_{Y}$), quality of mutual information ($Q_{MI}$), the gradient-based fusion metric ($Q_{G}$), and Chen-Blum's metric ($Q_{CB}$) show that the proposed technique outperforms existing methods.
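The following is a minimal, illustrative sketch of the pipeline described above for two grayscale exposures, not the authors' implementation. The ordering of the steps (two-scale decomposition, Laplacian-of-Gaussian edge saliency, median filtering of histogram-equalized exposures for ghost suppression, guided-filter refinement of the weights, and layer-wise fusion) follows the abstract, but the filter sizes, the specific ghost-weight formula, and the guided-filter parameters are assumptions chosen for clarity.

```python
# Sketch of the described fusion pipeline, assuming grayscale float32 inputs in [0, 1].
# Kernel sizes, epsilon values, and the ghost-weight heuristic are illustrative assumptions.
import cv2
import numpy as np

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Box-filter guided filter (He et al.), used here only to smooth weight maps."""
    ksize = (2 * radius + 1, 2 * radius + 1)
    mean_g, mean_s = cv2.blur(guide, ksize), cv2.blur(src, ksize)
    var_g = cv2.blur(guide * guide, ksize) - mean_g * mean_g
    cov_gs = cv2.blur(guide * src, ksize) - mean_g * mean_s
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return cv2.blur(a, ksize) * guide + cv2.blur(b, ksize)

def fuse_exposures(images):
    images = [img.astype(np.float32) for img in images]

    # Two-scale decomposition: base = large-scale structure, detail = residual small-scale detail.
    bases = [cv2.blur(img, (31, 31)) for img in images]
    details = [img - b for img, b in zip(images, bases)]

    # Edge information from a Laplacian-of-Gaussian response; its magnitude serves as saliency.
    saliency = [np.abs(cv2.Laplacian(cv2.GaussianBlur(img, (5, 5), 0), cv2.CV_32F))
                for img in images]

    # Ghost-suppression weight: median-filtered, histogram-equalized exposures; pixels that
    # deviate strongly from the cross-exposure mean get lower weight (assumed heuristic).
    eq = [cv2.medianBlur(cv2.equalizeHist((img * 255).astype(np.uint8)), 5).astype(np.float32) / 255.0
          for img in images]
    mean_eq = np.mean(eq, axis=0)
    ghost = [np.exp(-4.0 * (e - mean_eq) ** 2) for e in eq]

    # Combine saliency and ghost weights, then refine with the guided filter
    # (larger radius for base-layer weights, smaller radius for detail-layer weights).
    raw = [s * g + 1e-6 for s, g in zip(saliency, ghost)]
    w_base = [guided_filter(img, r, radius=16, eps=1e-1) for img, r in zip(images, raw)]
    w_detail = [guided_filter(img, r, radius=4, eps=1e-3) for img, r in zip(images, raw)]

    # Normalize the weights per layer, fuse each layer, and recombine base + detail.
    wb = np.clip(np.stack(w_base), 0, None); wb /= wb.sum(axis=0) + 1e-6
    wd = np.clip(np.stack(w_detail), 0, None); wd /= wd.sum(axis=0) + 1e-6
    fused_base = np.sum(wb * np.stack(bases), axis=0)
    fused_detail = np.sum(wd * np.stack(details), axis=0)
    return np.clip(fused_base + fused_detail, 0, 1)
```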