Abstract

Owing to the diversity and complementarity of multi-modality images, this paper proposes a fusion method based on a simplified pulse coupled neural network (SPCNN) whose parameters adapt to the inherent characteristics of the image, in order to extract optimal feature information from the different source images. Building on an analysis of the traditional pulse coupled neural network (PCNN), a simplified PCNN model based on the spiking cortical model (SCM) is designed. To highlight the distinct features of different modality images, the SPCNN parameters are set adaptively according to the inherent characteristics of the images, and a fusion method in the non-subsampled shearlet transform (NSST) domain is proposed. First, the source images are decomposed by the NSST at multiple scales and in multiple directions. For the high-frequency coefficients, the SPCNN with adaptive parameters computes the firing time of each coefficient, and the coefficients are fused by comparing these firing times. To preserve the detail and edge information of the low-frequency coefficients, the combination of regional energy and energy of gradient is taken as the activity level, and the low-frequency coefficients are fused by a weighted fusion rule. Finally, the fused image is reconstructed by the inverse NSST. Experimental results show that the proposed method outperforms five classical methods: the fused results are consistent with human visual perception, and the fused images have good contrast and rich detail.
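To make the two fusion rules concrete, the sketch below implements them in NumPy/SciPy. It is a minimal illustration under stated assumptions, not the authors' implementation: the NSST decomposition itself is omitted (the functions operate on already-decomposed sub-band coefficients), the SPCNN uses a common SCM-style formulation with fixed constants rather than the adaptive, image-derived parameter values the paper proposes, and all function names (`spcnn_firing_times`, `fuse_highfreq`, `fuse_lowfreq`) are hypothetical.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

def spcnn_firing_times(S, n_iter=110, beta=0.3, alpha_f=0.3,
                       alpha_e=0.7, V_L=1.0, V_E=20.0):
    """Return the iteration at which each neuron first fires.

    A common SCM-style simplified PCNN; parameters here are fixed
    constants, whereas the paper derives them from image statistics.
    """
    S = S / (S.max() + 1e-12)                 # normalize stimulus to [0, 1]
    W = np.array([[0.5, 1.0, 0.5],           # linking weights to 8 neighbours
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])
    U = np.zeros_like(S)                      # internal activity
    E = np.ones_like(S)                       # dynamic threshold
    Y = np.zeros_like(S)                      # firing output
    T = np.full(S.shape, n_iter + 1)          # first firing time (large = never)
    for n in range(1, n_iter + 1):
        L = convolve(Y, W, mode='constant')                  # neighbour linking
        U = np.exp(-alpha_f) * U + S * (1 + beta * V_L * L)  # feed + modulation
        Y = (U > E).astype(float)                            # fire if above threshold
        E = np.exp(-alpha_e) * E + V_E * Y                   # decay, jump on firing
        T = np.where((Y > 0) & (T > n_iter), n, T)           # record first firing
    return T

def fuse_highfreq(cA, cB, **kw):
    """High-frequency rule: neuron fed by |coefficient|; earlier firing wins."""
    tA = spcnn_firing_times(np.abs(cA), **kw)
    tB = spcnn_firing_times(np.abs(cB), **kw)
    return np.where(tA <= tB, cA, cB)

def _eog(x):
    """Energy of gradient via forward differences."""
    gx = np.diff(x, axis=1, prepend=x[:, :1])
    gy = np.diff(x, axis=0, prepend=x[:1, :])
    return gx**2 + gy**2

def fuse_lowfreq(lA, lB, win=3):
    """Low-frequency rule: activity = regional energy + energy of gradient,
    then a weighted average of the two bands."""
    actA = uniform_filter(lA**2, win) + uniform_filter(_eog(lA), win)
    actB = uniform_filter(lB**2, win) + uniform_filter(_eog(lB), win)
    w = actA / (actA + actB + 1e-12)          # per-pixel fusion weight
    return w * lA + (1 - w) * lB
```

In the full method, each pair of corresponding sub-bands produced by the NSST decomposition of the two source images would be passed through `fuse_highfreq` or `fuse_lowfreq`, and the fused sub-bands would then be recombined by the inverse NSST.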
