Abstract

Infrared and visible image fusion (IVIF) results typically suffer from detail loss, noise, low contrast, and blurred edges. In this paper, a new method is proposed to address detail loss, low contrast, and blurring in IVIF. Specifically, visible images are enhanced with a guided filter and high-dynamic-range compression, while infrared images are normalized by a linear transformation. We then apply blur/clear discrimination to detect salient pixels between the infrared and visible images: a fully weight-shared multi-path residual neural network is proposed to discriminate between blurred and clear pixels at the same position in the infrared and visible images. Clear pixels are treated as salient and contribute more to the fused image than blurred pixels. The network outputs a binary classification map for the blur/clear decision, which serves as the fusion weight map in the fusion stage. To address the resulting discontinuities, we compute the distance transforms of the binary classification map and of its complement; these two distance-transformed maps are used as weight maps to fuse the enhanced infrared and visible images. Finally, single-scale retinex (SSR) is applied to further enhance the fused images. Experimental results on public IVIF datasets demonstrate the superior performance of the proposed approach over state-of-the-art methods in terms of both subjective visual quality and objective metrics. The source code is available at https://github.com/eyob12/Multi_path_residual_neural_network_based_IVIF
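The distance-transform weighting described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function name, the `tau` ramp width, and the clipped signed-distance mapping from the two distance maps to a continuous weight are assumptions introduced here for clarity.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distance_weight_fusion(ir, vis, clear_map, tau=5.0):
    """Fuse enhanced infrared and visible images with distance-transformed
    weight maps (illustrative sketch of the scheme in the abstract).

    clear_map: binary classification map from the network,
               1 = visible pixel judged clear, 0 = judged blurred.
    """
    # Distance transforms of the binary map and of its complement:
    # d_clear grows toward the interior of the "clear" region,
    # d_blur toward the interior of the "blurred" region.
    d_clear = distance_transform_edt(clear_map)
    d_blur = distance_transform_edt(1 - clear_map)
    # Signed distance to the decision boundary, mapped to a [0, 1] ramp
    # of width 2*tau pixels. This smoothing is one plausible choice and
    # may differ from the authors' exact formulation.
    signed = d_clear - d_blur
    w_vis = np.clip(signed / (2.0 * tau) + 0.5, 0.0, 1.0)
    w_ir = 1.0 - w_vis
    # Pixel-wise weighted fusion: clear pixels favor the visible image,
    # blurred pixels favor the infrared image.
    return w_vis * vis + w_ir * ir
```

Far from the clear/blurred boundary the weights saturate at 0 or 1, so each region is dominated by one source image, while the ramp near the boundary removes the hard seam that a raw binary weight map would produce.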
