Abstract
The aim of this study was to develop a procedure for the fusion of thermal and visual image data. The first stage of the proposed algorithm consisted of image acquisition carried out so that the same parts of both images represented the same parts of the observed terrain. The second stage depended on the prior information available about the features of the searched object, and two situations were considered. When the feature vector of the searched object was available for both representations, pattern recognition could be conducted separately on the visual and thermal images; in this way, the important parts of the images to be represented in the fused image were obtained. The other case examined in the paper was the situation in which no formalised information about the object was available. It was then necessary to analyse the whole images in order to identify the parts where the object could be located; this analysis should help an operator indicate the parts of the images containing artefacts that may be elements of the searched object. In this case, the second stage of the algorithm therefore consisted in computing local image features, namely the grey-scale gradient calculated for the pixels inside an aperture. The study presented examples of fused images obtained by means of the developed method.
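To make the second case more concrete, the sketch below shows one plausible reading of the local-feature stage: a per-pixel grey-scale gradient magnitude averaged over a square aperture, followed by a pixel-wise selection between the co-registered visual and thermal images. The central-difference gradient, the aperture size, the max-activity selection rule, and the function names (local_gradient_feature, fuse) are illustrative assumptions, not the paper's stated choices.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def local_gradient_feature(image: np.ndarray, aperture: int = 5) -> np.ndarray:
    """Mean grey-scale gradient magnitude inside a square aperture
    centred on each pixel. Central differences and aperture=5 are
    assumptions; the paper does not fix the exact operator."""
    img = image.astype(np.float64)
    gy, gx = np.gradient(img)              # central-difference gradients
    magnitude = np.hypot(gx, gy)           # grey-scale gradient magnitude
    return uniform_filter(magnitude, size=aperture)  # average over aperture


def fuse(visual: np.ndarray, thermal: np.ndarray, aperture: int = 5) -> np.ndarray:
    """Per pixel, keep the modality with the stronger local gradient.
    Assumes the images are already co-registered (stage one), so the
    same pixel positions cover the same parts of the terrain. This
    max-activity rule is an illustrative stand-in for the fusion step."""
    fv = local_gradient_feature(visual, aperture)
    ft = local_gradient_feature(thermal, aperture)
    return np.where(fv >= ft, visual, thermal)
```

Under these assumptions, regions with strong visual texture (edges, contours) would be drawn from the visual image, while regions where only the thermal channel carries structure would be drawn from the thermal image, so both kinds of artefacts remain visible to the operator in the fused result.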