Abstract

The use of object-based image analysis (OBIA) for high spatial resolution image classification can be divided into two main steps: segmentation, followed by labeling of the objects according to a particular set of features and a classifier. Decision trees are often used to represent human knowledge in the latter step. The difficulty lies in selecting a small subset of features from a feature space of spatial, spectral, and textural variables to describe the classes of interest, which raises the question of which feature selection (FS) method is most suitable. In this work, an approach to FS within a decision tree was introduced using a single perceptron and the backpropagation algorithm. Three alternatives were compared: single, double, and multiple inputs, the last combined with a sequential backward search (SBS). Test regions were used to evaluate the efficiency of the proposed methods. Results showed that it is possible to use a single perceptron in each node, with an overall accuracy (OA) between 77.6% and 77.9%. Only SBS reached an OA above 88%; thus, the quality of the proposed solution depends on the number of input features.
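To make the node-level idea concrete, the following is a minimal sketch (not the authors' code) of the single-input alternative: at a given decision-tree node, one perceptron is trained per candidate feature, and the feature whose perceptron best separates the two groups of classes is kept. For a single sigmoid unit, backpropagation reduces to gradient descent on the output error; the learning rate, epoch count, toy data, and any feature names are illustrative assumptions.

```python
import numpy as np

def train_single_input_perceptron(x, y, lr=0.5, epochs=200):
    """Train a one-input perceptron (weight + bias) with a sigmoid unit.

    For a single unit, backpropagation reduces to gradient descent on the
    squared error of the sigmoid output.
    x: (n,) feature values scaled to [0, 1]; y: (n,) labels in {0, 1}.
    Returns (weight, bias, training accuracy).
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            o = 1.0 / (1.0 + np.exp(-(w * xi + b)))  # sigmoid activation
            delta = (yi - o) * o * (1.0 - o)         # gradient of squared error
            w += lr * delta * xi
            b += lr * delta
    acc = float(np.mean((1.0 / (1.0 + np.exp(-(w * x + b))) > 0.5) == y))
    return w, b, acc

def best_single_feature(X, y):
    """Score every candidate feature with its own perceptron; keep the best."""
    scores = [train_single_input_perceptron(X[:, j], y)[2]
              for j in range(X.shape[1])]
    return int(np.argmax(scores)), scores

# Toy usage: 60 image objects and 4 candidate features (names such as mean
# NDVI or texture contrast would be hypothetical); feature 2 is made informative.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 60)
X = rng.random((60, 4))
X[:, 2] = 0.3 * X[:, 2] + 0.7 * y
j, scores = best_single_feature(X, y)
print(f"selected feature {j}; per-feature accuracies: {np.round(scores, 2)}")
```

The double- and multiple-input alternatives would follow the same pattern with a perceptron over pairs or larger subsets of features; this sketch only illustrates the single-input case.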

Highlights

  • The increasing development of multi- and hyperspectral sensors, as well as object-based image analysis techniques for classifying high spatial resolution satellite imagery, has led to a large amount of data, as illustrated by the substantial set of features available to describe classes of interest in the classification step

  • The perceptron concept was applied in each node to select the best features to distinguish the classes; the set of features for each node is displayed in the second column of Table 3

  • Three feature selection approaches based on the perceptron concept were compared


Summary

Introduction

The increasing development of multi- and hyperspectral sensors, together with object-based image analysis techniques for classifying high spatial resolution satellite imagery, has led to a large amount of data, as illustrated by the substantial set of features available to describe the classes of interest in the classification step. As Haertel and Landgrebe (1999) point out, a high-dimensional feature space can cause problems in estimating the classes' covariance matrices. With a parametric classifier, as the feature space dimensionality increases, so does the number of samples required to provide a reliable estimate of the covariance matrix, a behavior known as the Hughes phenomenon. Decreasing the number of descriptors in the feature space also reduces the computational cost, since fewer features require less storage capacity.
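As an illustration of how such a reduction can be carried out, below is a minimal sketch of the sequential backward search (SBS) named in the abstract, written as a generic wrapper around a scikit-learn classifier. The criterion (cross-validated accuracy), the decision-tree estimator, and the target subset size are assumptions for the sketch, not the authors' exact setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def sequential_backward_search(X, y, n_keep, estimator=None, cv=5):
    """Start from all features; repeatedly drop the feature whose removal
    leaves the highest cross-validated accuracy, until n_keep remain."""
    estimator = estimator or DecisionTreeClassifier(random_state=0)
    selected = list(range(X.shape[1]))
    while len(selected) > n_keep:
        trials = []
        for j in selected:
            subset = [k for k in selected if k != j]
            score = cross_val_score(estimator, X[:, subset], y, cv=cv).mean()
            trials.append((score, j))
        best_score, dropped = max(trials)  # removing `dropped` hurts least
        selected.remove(dropped)
        print(f"dropped feature {dropped}: CV accuracy {best_score:.3f}")
    return selected

# Toy usage: random data standing in for spectral/textural object features,
# with two informative columns (0 and 3).
rng = np.random.default_rng(1)
X = rng.random((120, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0.8).astype(int)
print("kept:", sequential_backward_search(X, y, n_keep=3))
```

Because each elimination step retrains the classifier once per remaining feature, SBS trades computational cost during selection for a smaller feature space afterwards, which is consistent with the storage and estimation benefits discussed above.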

