Abstract

The analysis of hyperspectral data, due to their high spectral resolution, requires dealing with the problem of the curse of dimensionality. Many feature selection/extraction techniques have been developed, which map the hyperdimensional feature space into a lower-dimensional space, based on the optimization of a suitable criterion function. This paper studies the impact of several such techniques and of the chosen criterion on the accuracy of different supervised classifiers. The compared methods are 'Sequential Forward Selection' (SFS), 'Steepest Ascent' (SA), 'Fast Constrained Search' (FCS), 'Projection Pursuit' (PP) and 'Decision Boundary Feature Extraction' (DBFE), while the considered criterion functions are standard interclass distance measures. SFS is well known for its conceptual and computational simplicity. SA provides more effective subsets of selected features at the price of a higher computational cost. DBFE is an effective transformation technique, usually applied after a preliminary feature-space reduction through PP. The experimental comparison is performed on an AVIRIS hyperspectral data set characterized by 220 spectral bands and nine ground cover classes. The computational time of each algorithm is also reported.
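To make the selection strategy concrete, the following is a minimal sketch of Sequential Forward Selection (SFS) driven by an interclass distance criterion. It is not the paper's implementation: the Bhattacharyya-style distance between two class-conditional Gaussians and all function and variable names (`bhattacharyya`, `sfs`, `X1`, `X2`) are illustrative assumptions, shown only to indicate how a greedy band-by-band search against such a criterion proceeds.

```python
# Minimal, hypothetical sketch of SFS with an interclass distance criterion.
import numpy as np

def bhattacharyya(X1, X2):
    """Bhattacharyya distance between two classes, assuming Gaussian statistics."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    c1 = np.atleast_2d(np.cov(X1, rowvar=False)) + 1e-6 * np.eye(X1.shape[1])
    c2 = np.atleast_2d(np.cov(X2, rowvar=False)) + 1e-6 * np.eye(X2.shape[1])
    c = 0.5 * (c1 + c2)
    d = m1 - m2
    mean_term = 0.125 * d @ np.linalg.solve(c, d)
    cov_term = 0.5 * np.log(np.linalg.det(c) /
                            np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
    return mean_term + cov_term

def sfs(X1, X2, n_features):
    """Greedily add, one at a time, the band that most increases the criterion."""
    selected = []
    remaining = list(range(X1.shape[1]))
    while len(selected) < n_features:
        best_band, best_score = None, -np.inf
        for band in remaining:
            trial = selected + [band]
            score = bhattacharyya(X1[:, trial], X2[:, trial])
            if score > best_score:
                best_band, best_score = band, score
        selected.append(best_band)
        remaining.remove(best_band)
    return selected

# Hypothetical usage: X1 and X2 are (samples, bands) arrays for two ground-cover
# classes; the call returns the indices of the 10 greedily selected bands.
# bands = sfs(X1, X2, n_features=10)
```

Multiclass use would replace the pairwise criterion with an average or minimum over class pairs; the greedy structure of the search is unchanged.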
