Abstract

The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, QuickBird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison with full spectral imaging. This limitation may lead to confusion between materials such as different roofs, pavements, and roads, and may therefore result in misinterpretation and misuse of classification products. Hyperspectral data are an alternative, but their low spatial resolution (compared with multispectral data) restricts their use in many applications. A further improvement can be achieved by fusion of multisensor data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach to fusing very high resolution SAR and multispectral data for automatic classification in urban areas. Single-polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, which consists of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework provides a principled way of combining multisource data following consensus theory. The classification is not affected by the limitations of dimensionality, and the computational complexity depends primarily on the dimensionality-reduction step. Fusion of single-polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy.
A comparison with classification results of WorldView-2 multispectral data alone (8 spectral bands) is provided, and numerical evaluation against other established methods illustrates the gain in classification accuracy for many classes, such as buildings, low vegetation, sports facilities, forest, roads, railroads, etc.

Highlights

  • Availability of high and very high spatial resolution multisensor data opens new perspectives for processing, recognition, and decision making in urban areas containing a variety of objects and structures

  • For fusion and classification of single-sensor data (VNIR, WV-2, WV-2 + Digital Surface Model (DSM)) using INFOFUSE, 100 clusters were used for each feature

  • The best classification accuracy is provided by the INFOFUSE and Neural Network (NN) methods on the combination of the multispectral data, Gabor texture features (computed on both the TSX band and an optical band), and the DSM data
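The Gabor texture features mentioned in the highlights can be illustrated with a minimal sketch. This is not the authors' implementation: the kernel parameters (frequency, orientation, sigma, window size) and the sliding-window correlation are illustrative assumptions, written in plain NumPy.

```python
import numpy as np

def gabor_kernel(freq, theta, sigma, size=15):
    """Real-valued Gabor kernel: a cosine carrier at spatial frequency
    `freq`, oriented at angle `theta`, under a Gaussian envelope.
    (Parameter choices here are illustrative, not from the paper.)"""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * freq * xr)

def gabor_response(img, kernel):
    """Magnitude of a 'valid' 2-D correlation of the image with the kernel."""
    windows = np.lib.stride_tricks.sliding_window_view(img, kernel.shape)
    return np.abs(np.einsum('ijkl,kl->ij', windows, kernel))

# Example: vertical stripes respond far more strongly to the
# orientation-matched filter than to the orthogonal one.
cols = np.arange(64)
stripes = np.cos(2.0 * np.pi * 0.2 * cols)[None, :] * np.ones((64, 1))
k_match = gabor_kernel(freq=0.2, theta=0.0, sigma=3.0)
k_ortho = gabor_kernel(freq=0.2, theta=np.pi / 2, sigma=3.0)
r_match = gabor_response(stripes, k_match).mean()
r_ortho = gabor_response(stripes, k_ortho).mean()
```

In practice a bank of such filters (several frequencies and orientations) is applied to each band, and each filter response becomes one feature channel entering the clustering step.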


Summary

INTRODUCTION

Availability of high and very high spatial resolution multisensor data opens new perspectives for processing, recognition, and decision making in urban areas containing a variety of objects and structures. The limited spectral range covered by multispectral sensors does not allow a high thematic classification accuracy, or a relatively large number of classes, to be obtained. Different modalities and different types of digital data (e.g. multispectral, SAR, Digital Elevation Model (DEM), Geographic Information System (GIS), vector maps, etc.) allow a significant increase in the accuracy of automatic recognition and interpretation of urban areas, but only when a correct fusion methodology is used. Several fusion methodologies following consensus theory (Benediktsson et al., 1997) have been developed and successfully used (Pacifici et al., 2008; Fauvel et al., 2006; Rottensteiner et al., 2004), but the number of thematic classes is still low. The overall accuracy of classification for 6 classes (Large buildings, Houses, Large roads, Streets, Open areas, and Shadows) is 75.7 %.
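The INFOFUSE idea described in the abstract (feature extraction, per-feature unsupervised clustering onto a finite domain, then aggregation) can be sketched in miniature. This is a hedged illustration, not the paper's implementation: a simple 1-D k-means stands in for the clustering stage, and a naive-Bayes fusion of cluster labels stands in for the Bayesian aggregator (the paper also uses a neural network for this step).

```python
import numpy as np

def kmeans_1d(x, k, iters=20, seed=0):
    """Quantize one feature band into k cluster labels
    (the 'finite domain' representation of that feature)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)

def fit_naive_bayes(label_mat, y, k, n_classes, alpha=1.0):
    """Per-feature class-conditional cluster histograms (Laplace-smoothed)."""
    n_feat = label_mat.shape[1]
    logp = np.zeros((n_feat, n_classes, k))
    prior = np.zeros(n_classes)
    for c in range(n_classes):
        idx = y == c
        prior[c] = idx.mean()
        for f in range(n_feat):
            counts = np.bincount(label_mat[idx, f], minlength=k) + alpha
            logp[f, c] = np.log(counts / counts.sum())
    return np.log(prior), logp

def predict(label_mat, log_prior, logp):
    """Aggregate the per-feature cluster evidence into a class decision."""
    n = label_mat.shape[0]
    scores = np.tile(log_prior, (n, 1))          # (n_samples, n_classes)
    for f in range(label_mat.shape[1]):
        scores += logp[f, :, label_mat[:, f]]
    return np.argmax(scores, axis=1)
```

Because each pixel is reduced to a short vector of cluster indices (100 clusters per feature in the highlights above), the aggregation step's complexity is governed by the quantization, not by the raw feature dimensionality.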

PROPOSED FUSION MODEL
Fusion strategies and classification
Feature extraction
RESULTS AND DISCUSSION
Method
CONCLUSIONS
