Abstract

Fusing infrared and visible images from different modalities has long been an active research topic in image processing. To address the problem that existing fusion methods cannot fully exploit the deep information in images, this paper proposes DAE-Nest, a fusion algorithm based on deep information extraction and enhancement. The algorithm consists of encoder, feature-enhancement, fusion-layer, and decoder modules and is trained in an unsupervised manner. First, an improved encoding structure extracts multi-scale features from the source images; second, the extracted features are enhanced to fully exploit deep image information; then, the enhanced features are fused according to the corresponding fusion strategy; finally, the decoder takes the fused features as input and reconstructs them into an information-rich fused image. DAE-Nest was evaluated in fusion experiments on the TNO and RoadScene datasets and compared with nine strong fusion algorithms. In terms of subjective evaluation, DAE-Nest makes full use of the deep information in the source images, and the fused images it generates accord with human visual perception. In terms of objective evaluation, DAE-Nest achieves the best performance on the TNO dataset in three metrics: average gradient, spatial frequency, and structural similarity; it also achieves the best fusion-quality score on the RoadScene dataset. The proposed fusion method therefore performs excellently both subjectively and objectively, effectively fuses infrared and visible images, and lays a solid foundation for subsequent vision tasks.
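The encode-enhance-fuse-decode pipeline described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' actual DAE-Nest implementation: the placeholder convolutional layers, the residual enhancement branch, and the element-wise averaging fusion rule are all hypothetical stand-ins for the paper's modules and fusion strategy.

```python
import torch
import torch.nn as nn

class DAENestSketch(nn.Module):
    """Hypothetical sketch of the encoder -> enhancement -> fusion -> decoder
    pipeline from the abstract; module internals are illustrative placeholders."""

    def __init__(self, channels=64):
        super().__init__()
        # Placeholder encoder: extracts feature maps from a one-channel image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        # Placeholder feature-enhancement block (assumed residual refinement).
        self.enhance = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # Placeholder decoder: reconstructs the fused image from fused features.
        self.decoder = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def fuse(self, f_ir, f_vis):
        # Stand-in fusion strategy: element-wise mean of the enhanced features;
        # the paper's actual fusion rule may differ.
        return (f_ir + f_vis) / 2

    def forward(self, ir, vis):
        f_ir = self.encoder(ir)
        f_vis = self.encoder(vis)            # shared encoder for both modalities
        f_ir = f_ir + self.enhance(f_ir)     # enhancement applied as a residual
        f_vis = f_vis + self.enhance(f_vis)
        fused = self.fuse(f_ir, f_vis)
        return self.decoder(fused)

# Usage: fuse a pair of registered grayscale infrared/visible images.
ir = torch.rand(1, 1, 256, 256)
vis = torch.rand(1, 1, 256, 256)
out = DAENestSketch()(ir, vis)
print(out.shape)  # torch.Size([1, 1, 256, 256])
```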
