Abstract
Infrared and visible image fusion is an effective way to compensate for the limitations of single-sensor imaging; its goal is to produce fused images that are well suited to human observation and useful for subsequent applications and processing. To address incomplete feature extraction, loss of detail, and the small number of samples in common datasets, which hinders training, an end-to-end network architecture for image fusion is proposed. U-net is introduced into image fusion, and the final fusion result is obtained with a generative adversarial network. U-net's special "U"-shaped convolution structure extracts the important feature information as completely as possible, and because samples do not need to be cropped, the loss of fusion accuracy that cropping causes is avoided and training speed is improved. The features extracted by U-net are then trained adversarially against a discriminator that receives infrared images, yielding the generator model. Experimental results show that the proposed algorithm produces fused images with clear contours, prominent texture, and salient targets, and that metrics such as SD, SF, SSIM, and AG are clearly improved.
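As a rough illustration of the architecture the abstract describes, the following PyTorch sketch pairs a small U-net generator, which takes the concatenated infrared and visible images, with a patch-level discriminator that judges infrared images against fused ones. The channel widths, depths, and activation choices here are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a U-net generator / discriminator pair for IR-visible fusion.
# All layer sizes are assumptions chosen for illustration.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU: the basic U-net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class UNetGenerator(nn.Module):
    """Fuses a registered infrared / visible pair into a single image."""

    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(2, 32)       # input: 1 IR channel + 1 visible channel
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)     # 64 upsampled + 64 from the skip connection
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)      # 32 upsampled + 32 from the skip connection
        self.out = nn.Conv2d(32, 1, 1)

    def forward(self, ir, vis):
        x = torch.cat([ir, vis], dim=1)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # "U"-shaped skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # "U"-shaped skip connection
        return torch.tanh(self.out(d1))


class Discriminator(nn.Module):
    """Scores patches as real infrared vs. generated fused content."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch-level real/fake scores
        )

    def forward(self, x):
        return self.net(x)
```

Because the generator is fully convolutional, whole registered image pairs (with side lengths divisible by 4 in this sketch) can be fed in without cropping, which is the property the abstract credits for avoiding accuracy loss and speeding up training.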
Highlights
Infrared and visible image fusion is an effective way to compensate for the limitations of single-sensor imaging
To address incomplete feature extraction, loss of detail, and the small number of samples in common datasets, which hinders training, an end-to-end network architecture for image fusion is proposed
U-net is introduced into image fusion, and the final fusion result is obtained with a generative adversarial network
Summary
1.1 U-GAN network model
The registered infrared and visible images are fed into the network. To preserve more detail and texture information during fusion, features are extracted through U-net's internal "U"-shaped convolution structure, as shown in Fig. 1. From the TNO image fusion dataset, 30 images are selected for testing; 10 image pairs are then chosen at random to compare the proposed method with FusionGAN. The experimental results are shown in Fig. 4, where, from left to right, are the infrared image, the visible image, the FusionGAN fusion result, and the result of the proposed method. Table 1 compares the metric values on the TNO image fusion dataset, and Fig. 4 compares the corresponding experimental results. The comparison of the AG, FD, EI, and RSD metric values on the INO database is given in Table 2.
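For reference, the following NumPy sketch computes three of the metrics reported in the comparison tables (SD, SF, and AG) under their standard definitions for a single-channel image; whether the paper uses exactly these formulations is an assumption.

```python
# Hedged sketch of common no-reference fusion metrics: SD, SF, and AG.
import numpy as np


def standard_deviation(img):
    """SD: spread of grey levels; larger values indicate higher contrast."""
    return float(np.std(img.astype(np.float64)))


def spatial_frequency(img):
    """SF: combines row-wise and column-wise first differences."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))


def average_gradient(img):
    """AG: mean magnitude of local gradients; reflects texture and sharpness."""
    img = img.astype(np.float64)
    gx = np.diff(img, axis=1)[:-1, :]  # horizontal differences, trimmed to a common shape
    gy = np.diff(img, axis=0)[:, :-1]  # vertical differences, trimmed to a common shape
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))
```

Applied to a fused image loaded as a greyscale array, these functions yield the kind of per-image scores that would be averaged over the 10 test pairs to fill the comparison tables.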