Abstract

Identifying saliency in high dynamic range (HDR) images is a fundamentally important issue in HDR imaging and plays a critical role in comprehensive scene understanding. Most existing studies leverage hand-crafted features for HDR image saliency prediction and lack the capability to fully exploit the characteristics of HDR images (i.e., wider luminance range and richer colour gamut). Here, systematic studies are carried out on HDR image saliency prediction by proposing a new framework that singles out the contributions from multi-exposure images. Specifically, inspired by the mechanism of HDR imaging, the method first utilizes graph neural networks to model the relations among the multi-exposure images and the tone-mapped image obtained from an HDR image, enabling more discriminative saliency-related feature representations. Subsequently, saliency features driven by global semantic knowledge are aggregated from the tone-mapped image by enhancing global context-aware semantic information. Finally, a fusion module is designed to integrate the saliency-oriented feature representations originating from the multi-exposure images and the tone-mapped image, producing the saliency maps of HDR images. Moreover, a new challenging HDR eye-fixation database (HDR-EYEFix) is created, which is expected to further contribute to research on HDR image saliency prediction. Experimental results show that the method achieves superior performance compared with state-of-the-art methods.
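The abstract outlines a three-stage architecture: graph-based relational modelling over the multi-exposure and tone-mapped features, global context-aware semantic enhancement of the tone-mapped branch, and a fusion module that decodes the saliency map. The PyTorch sketch below illustrates one way such a pipeline could be wired together. The class names (ExposureGraph, GlobalContext, SaliencyFusion), tensor shapes, attention-based message passing, and gating choices are all illustrative assumptions; the paper does not publish this code.

import torch
import torch.nn as nn

class ExposureGraph(nn.Module):
    """One round of attention-based message passing over the (K+1) image
    nodes: K exposure features plus the tone-mapped feature. A stand-in
    for the graph neural network stage described in the abstract."""
    def __init__(self, channels: int):
        super().__init__()
        self.msg = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, C, H, W); node descriptors by global average pooling
        B, N, C, H, W = feats.shape
        desc = feats.mean(dim=(3, 4))                        # (B, N, C)
        attn = torch.softmax(desc @ desc.transpose(1, 2) / C ** 0.5, dim=-1)
        msgs = self.msg(feats.flatten(0, 1)).view(B, N, C, H, W)
        # aggregate neighbour messages weighted by the graph attention
        agg = torch.einsum('bnm,bmchw->bnchw', attn, msgs)
        return feats + agg                                   # residual update

class GlobalContext(nn.Module):
    """Enhances the tone-mapped feature with global context-aware semantics
    (a squeeze-and-excitation-style gate, chosen here for illustration)."""
    def __init__(self, channels: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)

class SaliencyFusion(nn.Module):
    """Fuses graph-refined exposure features with the semantically enhanced
    tone-mapped feature and decodes a single-channel saliency map."""
    def __init__(self, channels: int):
        super().__init__()
        self.graph = ExposureGraph(channels)
        self.context = GlobalContext(channels)
        self.head = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, exposures: torch.Tensor, tone_mapped: torch.Tensor):
        # exposures: (B, K, C, H, W); tone_mapped: (B, C, H, W),
        # both assumed to come from a shared CNN backbone
        nodes = torch.cat([exposures, tone_mapped.unsqueeze(1)], dim=1)
        nodes = self.graph(nodes)
        exp_feat = nodes[:, :-1].mean(dim=1)      # pooled exposure branch
        sem_feat = self.context(nodes[:, -1])     # tone-mapped branch
        return self.head(torch.cat([exp_feat, sem_feat], dim=1))

# Shape check with random features (K=3 exposures, C=64, 32x32 grid).
if __name__ == "__main__":
    net = SaliencyFusion(channels=64)
    sal = net(torch.randn(2, 3, 64, 32, 32), torch.randn(2, 64, 32, 32))
    print(sal.shape)  # torch.Size([2, 1, 32, 32])

Treating each exposure (and the tone-mapped image) as a graph node keeps the relational stage independent of the number of exposures K, which matches the abstract's emphasis on singling out per-exposure contributions before fusion.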
