Abstract

Current research on infrared and visible image fusion tends to focus on designing decomposition or fusion schemes that improve feature preservation. However, raw images captured by multiple sensors under extreme conditions may suffer from noise, low contrast, and loss of textural and structural details, degradations that most current fusion schemes cannot eliminate. In this paper, a robust progressive series-parallel modality feature filtering framework (PSMFF) is proposed as a promising solution for 24/7 infrared and visible image integration. After fully considering the modality-specific characteristics of infrared and visible images, the first-stage enhancement module is constructed as a dual-solution system that combines a pixel-level attention-aware module with fidelity-driven intensity enhancement to overcome the noise and low contrast of the source images. Moreover, to further excavate and integrate the latent inherent information of each modality, a second-stage enhancement and fusion scheme based on the gradient-domain guided filter is proposed to better extract and aggregate significant features from the decomposed base and detail coefficients. Notably, the proposed PSMFF method can fulfill the low-light fusion task while preserving more informative edge and texture details, yielding satisfactory fusion results. Extensive experiments demonstrate that the proposed algorithm is superior to state-of-the-art methods in both subjective and objective evaluation.
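The second-stage scheme rests on a filter-based decomposition of each modality into base and detail layers before fusion. As a rough illustration only (this is not the authors' PSMFF: the plain guided filter of He et al. stands in for the gradient-domain variant, and simple averaging/max-absolute rules stand in for the paper's fusion strategy), a two-scale fusion of an infrared and a visible image might be sketched as:

```python
import numpy as np

def box_filter(img, r):
    """Mean filter over a (2r+1)x(2r+1) window using an integral image."""
    pad = np.pad(img, r, mode='edge')
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))          # zero row/col for clean indexing
    h, w = img.shape
    win = 2 * r + 1
    s = (c[win:win + h, win:win + w] - c[:h, win:win + w]
         - c[win:win + h, :w] + c[:h, :w])   # window sums via four corners
    return s / (win * win)

def guided_filter(I, p, r=8, eps=1e-2):
    """Classic guided filter: edge-preserving smoothing of p, guided by I."""
    mean_I, mean_p = box_filter(I, r), box_filter(p, r)
    var_I = box_filter(I * I, r) - mean_I ** 2
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)               # per-window linear coefficients
    b = mean_p - a * mean_I
    return box_filter(a, r) * I + box_filter(b, r)

def two_scale_fusion(ir, vis, r=8, eps=1e-2):
    """Split each modality into base + detail, then fuse:
    bases by averaging, details by max-absolute selection."""
    base_ir = guided_filter(ir, ir, r, eps)
    base_vis = guided_filter(vis, vis, r, eps)
    det_ir, det_vis = ir - base_ir, vis - base_vis
    fused_base = 0.5 * (base_ir + base_vis)
    fused_det = np.where(np.abs(det_ir) >= np.abs(det_vis), det_ir, det_vis)
    return np.clip(fused_base + fused_det, 0.0, 1.0)
```

Inputs are assumed to be single-channel arrays scaled to [0, 1]; the radius `r` and regularizer `eps` control how much structure lands in the base versus the detail layer.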
