Abstract
In recent years, many techniques for fusing multi-sensor satellite images have been developed. This article focuses on examining and improving the usability of pansharpened images for object detection, especially when fusing data with a high GSD ratio. The methodology for improving the interpretative ability of pansharpening results is based on pre-processing the panchromatic image with Logarithmic-Laplace filtration. The proposed approach was used to examine several pansharpening methods and data sets with different spatial resolution ratios, from 1:4 to 1:60. The results showed that the proposed approach significantly improves object detection in fused images, especially for imagery with a high resolution ratio. Interpretative ability was assessed with a qualitative method (based on image segmentation) and a quantitative method (using an indicator based on the Speeded Up Robust Features (SURF) detector). When combining data acquired by the same sensor, the interpretative potential improved by a dozen or so per cent. For data with a high resolution ratio, however, the improvement reached several dozen, or even several hundred, per cent in the case of images blurred after pansharpening by the classic method (with the original panchromatic image). Image segmentation showed that it is possible to recognize narrow objects that were originally blurred and difficult to identify. In addition, for panchromatic images acquired by WorldView-2, the proposed approach improved not only object detection but also the spectral quality of the fused image.
Highlights
The development of image acquisition techniques expands the possibilities of their application. Imagery obtained from various altitudes is used for many analyses in numerous fields of science and technology, such as urban planning and environmental monitoring [1,2], archaeology [3], and land-use and land-cover mapping [4,5]. Each image is described by its resolution; the most important in remote sensing are spectral and spatial resolution.
We focus on exploring the usability of various pansharpened images for object detection, especially by automated algorithms that are based on detectors of characteristic points in the image
Our approach is inspired by the research mentioned above but has a different purpose. In this manuscript we propose a methodology for pre-processing the panchromatic image that aims to improve the interpretative ability of data fusion with a standard GSD ratio but, above all, is dedicated to non-standard data.
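To make the pre-processing idea concrete, the following is a minimal numpy sketch of one plausible reading of Logarithmic-Laplace filtration: log-transform the panchromatic band to compress its dynamic range, then sharpen it with a 3x3 Laplacian. The function name and the exact filter kernel are illustrative assumptions, not the article's published formulation.

```python
import numpy as np

def log_laplace_filter(pan):
    """Illustrative sketch (assumed interpretation, not the paper's exact
    method): log-transform a panchromatic band, then apply a 4-neighbour
    Laplacian and subtract it to emphasise edges."""
    log_img = np.log1p(pan.astype(np.float64))  # compress dynamic range
    p = np.pad(log_img, 1, mode="edge")         # replicate borders
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
           - 4.0 * p[1:-1, 1:-1])               # 4-neighbour Laplacian
    return log_img - lap                        # Laplacian sharpening

pan = np.random.default_rng(0).integers(0, 2048, size=(8, 8))
out = log_laplace_filter(pan)
print(out.shape)  # (8, 8)
```

On a perfectly flat region the Laplacian vanishes, so the filter reduces to the log transform alone; only edges and texture are amplified.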
Summary
Each image is described by its resolution; the most important in remote sensing are spectral and spatial resolution. The spectral resolution depends on the number and width of the spectral ranges in which the image is acquired. The spatial resolution is defined by the distance between neighbouring pixels, the Ground Sampling Distance (GSD) [6]. The availability of a variety of imagery data motivates attempts at their integration. Pansharpening, the process of combining a high-resolution panchromatic image (PAN) with a low-resolution multispectral image (MS), allows one to obtain images with both high spatial and spectral resolution. Image integration is typically performed for data acquired by sensors mounted on the same platform, for which the GSD ratio ranges from 1:2 to 1:5 [7] (this ratio range is referred to as the standard one). There are many approaches to the integration of such data, among which two main groups can be distinguished: component substitution and multiresolution analysis.
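As a concrete example of the component-substitution family, the following is a minimal numpy sketch of the classic Brovey transform (one well-known method of this group; the article evaluates several such methods, not necessarily this one). It assumes the MS bands have already been resampled to the PAN grid.

```python
import numpy as np

def brovey_pansharpen(ms, pan):
    """Minimal Brovey-transform sketch (component-substitution family).
    ms:  (H, W, B) multispectral image resampled to the PAN grid
    pan: (H, W) panchromatic band
    Each MS band is rescaled by the per-pixel ratio of PAN to the
    summed MS intensity, injecting the PAN spatial detail."""
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    intensity = ms.sum(axis=2) + 1e-9        # avoid division by zero
    ratio = pan / intensity                  # per-pixel injection gain
    return ms * ratio[..., None]             # rescale each band
```

When the PAN value equals the summed MS intensity at every pixel, the ratio is 1 and the bands pass through unchanged; elsewhere the PAN detail modulates all bands, which is also why Brovey-type methods can distort spectral balance.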