Abstract

Infrared (IR) and visible images are heterogeneous data, and their fusion is an important research topic in the remote sensing field. In the last decade, deep networks have been widely used in image fusion due to their ability to preserve high-level semantic information. However, because of the lower resolution of IR images, deep learning-based methods may fail to retain the salient features of IR images. In this article, a novel IR and visible image fusion method based on IR Features & Multiscale Dense Network (IR-MSDNet) is proposed to preserve the content and key target features from both visible and IR images in the fused image. It comprises an encoder, a multiscale decoder, a traditional processing unit, and a fusion unit, and can capture the rich background details in visible images and the prominent target details in IR features. When the dense and multiscale features are fused, background details are obtained with an attention strategy and then combined with complementary edge features, while IR features are extracted by traditional quadtree decomposition and Bezier interpolation and further intensified by refinement. Finally, both the decoded multiscale features and the IR features are used to reconstruct the final fused image. Experimental comparison with other state-of-the-art fusion methods validates the superiority of the proposed IR-MSDNet in both subjective and objective evaluation metrics. An additional objective evaluation on the object detection (OD) task further verifies that IR-MSDNet greatly enhances the details in the fused images, yielding the best OD results.
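The abstract states that IR features are extracted via quadtree decomposition and Bezier interpolation, without detailing the interpolation step in this excerpt. As a minimal illustration of the Bezier ingredient only, the following evaluates a cubic Bezier curve from its Bernstein form; the function name and its use for intensity interpolation are illustrative assumptions, not taken from the paper:

```python
def bezier_cubic(p0: float, p1: float, p2: float, p3: float, t: float) -> float:
    """Evaluate a cubic Bezier curve at parameter t in [0, 1].

    Uses the Bernstein polynomial form:
    B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3
    """
    s = 1.0 - t
    return s**3 * p0 + 3.0 * s**2 * t * p1 + 3.0 * s * t**2 * p2 + t**3 * p3
```

By construction the curve passes through the endpoints (`t = 0` gives `p0`, `t = 1` gives `p3`), while the inner control points `p1` and `p2` shape the transition, which is what makes Bezier curves a common choice for smooth intensity interpolation.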

Highlights

  • Remote sensing image fusion has been studied for decades because complementary information from multisource remote sensing images is of great help to various remote sensing applications such as surveillance and object detection (OD) [1], [2]

  • One process constructs an encoder and decoder network to realize the initial fusion of visible and IR images; the other further fuses IR image features extracted by traditional methods with the initial fusion image to compensate for the loss of IR image detail caused by convolutional neural networks (CNNs)

  • For objective evaluation, seven quantitative quality metrics are selected: entropy (En) [34]; Qabf [35], which measures the quality of visual information transferred to the fused image; FMIw and FMIdct [36], which compute fast mutual information (FMI); a modified structural similarity SSIMa [37]; MS-SSIM [38], a multiscale structural similarity that emphasizes structural information; and the standard deviation (SD) [39], used to further analyze the quality of the fused image
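Two of the listed metrics, entropy (En) and standard deviation (SD), can be computed directly from the fused image's pixel statistics. A minimal sketch follows; the function names and the 8-bit histogram range are illustrative choices, not taken from the paper:

```python
import numpy as np

def entropy(img: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (En) of an 8-bit grayscale image, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log2(0)
    return float(-np.sum(p * np.log2(p)))

def standard_deviation(img: np.ndarray) -> float:
    """Standard deviation (SD): a global contrast measure of the image."""
    return float(np.std(img.astype(np.float64)))
```

For both metrics, higher values on the fused image generally indicate more retained information and contrast: a constant image scores 0 on both, while an image split evenly between two gray levels has exactly 1 bit of entropy.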


Summary

INTRODUCTION

Remote sensing image fusion has been studied for decades because complementary information from multisource remote sensing images is of great help to various remote sensing applications such as surveillance and object detection (OD) [1], [2]. In existing methods, the fusion rules adopted are relatively complex, so problems such as low efficiency and high computational cost are inevitable, and IR features need to be further enhanced for image reconstruction. To overcome these drawbacks, a novel IR and visible image fusion method based on IR Features & Multiscale Dense Network (IR-MSDNet) is proposed, which makes full use of the respective advantages of deep learning and traditional handcrafted features, especially IR feature extraction, to obtain a better fusion result. It preserves full background details and key target features from both visible and IR images.
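The traditional processing path relies on quadtree decomposition of the IR image to isolate salient regions. The exact split criterion used by IR-MSDNet is not given in this excerpt; a minimal sketch, assuming a block is split into four quadrants whenever its intensity range exceeds a threshold, could look like:

```python
import numpy as np

def quadtree_leaves(img, x=0, y=0, size=None, thresh=10.0, min_size=4):
    """Recursively split a square grayscale image into quadrants until each
    leaf block is nearly homogeneous (intensity range <= thresh) or reaches
    min_size. Returns a list of (x, y, size) leaf blocks.

    The range-based homogeneity test is an illustrative assumption.
    """
    if size is None:
        size = img.shape[0]
    block = img[y:y + size, x:x + size]
    # Stop splitting when the block is small or uniform enough.
    if size <= min_size or float(block.max()) - float(block.min()) <= thresh:
        return [(x, y, size)]
    half = size // 2
    leaves = []
    for dy in (0, half):
        for dx in (0, half):
            leaves += quadtree_leaves(img, x + dx, y + dy, half, thresh, min_size)
    return leaves
```

On an IR image, this yields small leaves around bright, high-contrast targets and large leaves over homogeneous background, which is why quadtree decomposition is a common first step for locating salient IR regions.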

Deep Learning Based Fusion Methods
Traditional Fusion Methods
Encoder
Multiscale Decoder
Training Encoder and Decoder Networks
Traditional Processing
Data Fusion
EXPERIMENTS AND ANALYSIS
Datasets
Related Parameters
Comparison and Analysis of Objective Evaluations
METHODS
Objective Evaluations on OD
CONCLUSION