Abstract

Numerous algorithms have been proposed to fuse multi-sensor images for a diverse range of applications. The fused result is itself an image that is more informative than any individual input. Accordingly, a number of fusion metrics have been developed to assess the performance of the fusion operation, or equivalently the quality of the fused image. Among these metrics, two approaches based on human visual system models employ the contrast sensitivity function (CSF) in their implementations. In the CSF, viewing distance is a crucial parameter that determines the filtering bandwidth, yet its effect has not been discussed in the available literature. This letter clarifies the use of the CSF in human perception-based fusion metrics. The experimental results illustrate how viewing distance affects fusion quality assessment, and Matlab™ code is provided in this letter.
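The letter's Matlab™ implementation is not reproduced here; purely as an illustration of how viewing distance can enter a CSF filter, the following Python sketch maps digital frequencies to cycles per degree using the viewing distance and display resolution and weights the image spectrum with the Mannos-Sakrison CSF. The function and parameter names (csf_filter_image, viewing_distance_cm, pixels_per_cm) and the choice of CSF model are assumptions for illustration, not the authors' code.

import numpy as np

def csf_mannos_sakrison(f_cpd):
    # Mannos-Sakrison contrast sensitivity as a function of
    # spatial frequency in cycles per degree (cpd).
    return 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)

def csf_filter_image(img, viewing_distance_cm=60.0, pixels_per_cm=38.0):
    # Illustrative sketch (assumed parameters, not the letter's Matlab code):
    # weight the image spectrum by the CSF. The viewing distance, together
    # with the display resolution, converts digital frequencies
    # (cycles/pixel) into cycles/degree, so changing the distance shifts
    # the effective pass-band of the filter.
    rows, cols = img.shape

    # Digital frequency grid in cycles per pixel.
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    f_cpp = np.sqrt(fx ** 2 + fy ** 2)

    # Pixels subtended by one degree of visual angle at this distance.
    pixels_per_degree = 2.0 * viewing_distance_cm * np.tan(np.deg2rad(0.5)) * pixels_per_cm
    f_cpd = f_cpp * pixels_per_degree

    weight = csf_mannos_sakrison(f_cpd)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * weight))

Running csf_filter_image on the same image with, say, viewing_distance_cm=30 versus 120 shows how a larger viewing distance pushes image content toward higher cycles per degree and therefore changes what the CSF-based metric attenuates.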
