Abstract

To better integrate complementary and redundant information from different source images, enhance edge information, and facilitate target detection, a multi-scale fusion algorithm for intensity and polarization-difference (PD) images based on edge information enhancement is proposed. First, intensity images are obtained by the polarization information analysis method, and PD images are obtained by an adaptive polarization-difference imaging approach based on the principle of minimum mutual information. Second, guided filtering, affine transformations, and Block-Matching and 3D filtering (BM3D) are embedded in a visibility enhancement step to improve the intensity and PD images. Third, the two images are decomposed into high-frequency and low-frequency sub-images by the dual-tree complex wavelet transform (DT-CWT); the high-frequency and low-frequency sub-images are fused by rules based on edge detection and on the regional variance matching degree, respectively. Finally, the fused image is obtained by the inverse DT-CWT. Experimental results demonstrate that fusion images produced by the proposed algorithm are significantly improved in information entropy, average gradient, and spatial frequency. Compared with existing methods, it achieves better edge enhancement for images acquired in a turbid medium.
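The following is a minimal sketch of the multi-scale fusion idea summarized above, not the paper's implementation. It uses PyWavelets' real DWT as a stand-in for the DT-CWT, a Sobel gradient magnitude as a simplified "edge detection" activity measure for the high-frequency bands, and a local-variance weighting as a simplified form of the regional variance matching rule for the low-frequency band; the function names and parameters (wavelet, levels, window size) are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy import ndimage

def edge_strength(x):
    """Sobel gradient magnitude, used here as a simple edge-activity measure."""
    gx = ndimage.sobel(x, axis=0)
    gy = ndimage.sobel(x, axis=1)
    return np.hypot(gx, gy)

def fuse_highpass(a, b):
    """Keep, per coefficient, the detail sub-band with the stronger edge response."""
    return np.where(edge_strength(a) >= edge_strength(b), a, b)

def fuse_lowpass(a, b, win=5):
    """Weight the approximation bands by their local (regional) variance."""
    var_a = ndimage.uniform_filter(a**2, win) - ndimage.uniform_filter(a, win)**2
    var_b = ndimage.uniform_filter(b**2, win) - ndimage.uniform_filter(b, win)**2
    w = var_a / (var_a + var_b + 1e-12)
    return w * a + (1 - w) * b

def fuse_images(intensity, pd, wavelet="db4", levels=3):
    """Decompose both images, fuse band-by-band, and reconstruct the result."""
    ca = pywt.wavedec2(intensity, wavelet, level=levels)
    cb = pywt.wavedec2(pd, wavelet, level=levels)
    fused = [fuse_lowpass(ca[0], cb[0])]
    for ha, hb in zip(ca[1:], cb[1:]):
        fused.append(tuple(fuse_highpass(a, b) for a, b in zip(ha, hb)))
    return pywt.waverec2(fused, wavelet)
```

In this sketch the inputs are expected to be pre-enhanced intensity and PD images of the same size; swapping the DWT for a DT-CWT implementation and refining the two fusion rules would bring it closer to the method described in the abstract.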
