Abstract
The dynamic range captured by ordinary digital cameras is much smaller than that of natural scenes, so a single image cannot fully record the details of a real scene when shooting high-dynamic-range scenes. Traditional multi-exposure image fusion algorithms often produce local brightness that is inconsistent with the corresponding scene, which easily leads to problems such as color distortion and loss of detail. Motivated by this, a novel multi-scale exposure fusion approach based on physical features is presented in this paper. Specifically, we first employ a novel Retinex model to estimate the illumination maps of the original images and use the obtained illumination maps to construct exposure maps. Then, a feature descriptor named patterns of oriented edge magnitudes (POEM) is introduced to extract local contrast from the source images. The extracted features are combined to construct a weight map, and the weight term is integrated into the Laplacian pyramid algorithm to obtain an initial fused image. Finally, simple detail enhancement and color compensation operations are performed to obtain the final fusion result. Extensive experiments reveal that the proposed approach yields comparable and even better results than some state-of-the-art techniques in both subjective and objective evaluation.
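To make the fusion step concrete, the sketch below illustrates the classical weighted Laplacian-pyramid fusion that the abstract builds on: each source image's Laplacian pyramid is blended level by level using the Gaussian pyramid of its weight map, then the fused pyramid is collapsed. This is only an illustrative sketch, not the paper's implementation; the `weights` argument stands in for the exposure/POEM-based weight maps described above, and all function names are our own.

```python
import cv2
import numpy as np

def build_gaussian_pyramid(img, levels):
    """Successively downsample to build a Gaussian pyramid."""
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def build_laplacian_pyramid(img, levels):
    """Band-pass pyramid: each level is Gaussian level minus its upsampled successor."""
    gp = build_gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels - 1):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)
    lp.append(gp[-1])  # coarsest residual
    return lp

def fuse_exposures(images, weights, levels=5):
    """Weighted Laplacian-pyramid fusion of an exposure stack.

    images  : list of HxWx3 uint8 source exposures
    weights : list of HxW per-pixel weight maps (e.g. exposure/contrast based)
    """
    # Normalize the weight maps so they sum to 1 at every pixel.
    stack = np.stack(weights).astype(np.float64) + 1e-12
    stack /= stack.sum(axis=0, keepdims=True)

    fused_pyr = None
    for img, w in zip(images, stack):
        lp = build_laplacian_pyramid(img.astype(np.float64), levels)
        gp_w = build_gaussian_pyramid(w, levels)
        # Blend each band with the smoothed weight map of the same scale.
        contrib = [l * g[..., None] for l, g in zip(lp, gp_w)]
        fused_pyr = contrib if fused_pyr is None else [f + c for f, c in zip(fused_pyr, contrib)]

    # Collapse the fused pyramid back into a single image.
    out = fused_pyr[-1]
    for lvl in reversed(fused_pyr[:-1]):
        out = cv2.pyrUp(out, dstsize=(lvl.shape[1], lvl.shape[0])) + lvl
    return np.clip(out, 0, 255).astype(np.uint8)
```

Blending with Gaussian-smoothed weights at every pyramid level, rather than with the raw weight maps, is what avoids the seams and halos that single-scale weighted averaging produces near exposure boundaries.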