Abstract
Classification of remotely sensed hyperspectral images calls for a classifier that gracefully handles high-dimensional data, where the number of samples available for training may be very low relative to the dimension. Even with a simple parametric classifier such as the Gaussian maximum-likelihood rule, the large number of bands leads to a large number of parameters to estimate, most of which measure correlations between features. The covariance structure of a multivariate normal population can be simplified by setting elements of the inverse covariance matrix to zero. Well-known results from time series analysis relate the estimation of the inverse covariance matrix to a sequence of regressions via the Cholesky decomposition. We observe that discriminant analysis can be performed without inverting the covariance matrix. We propose imposing a sparsity pattern on the lower triangular matrix resulting from the Cholesky decomposition, and develop a simple search algorithm for choosing this sparsity. The resulting classifier is applied to four different hyperspectral images and compared with conventional approaches such as support vector machines, with encouraging results.
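The observation that discriminant analysis needs no explicit matrix inverse can be illustrated with a short sketch. The snippet below is not the paper's implementation; it is a minimal, hedged example (function name, toy data, and the use of NumPy are assumptions) showing how the Gaussian maximum-likelihood score can be evaluated through the Cholesky factor of the covariance matrix, via a triangular solve, rather than by forming the inverse covariance matrix.

```python
import numpy as np

def gaussian_discriminant(x, mean, cov):
    """Log-likelihood of x under N(mean, cov), up to an additive constant,
    computed from the Cholesky factor cov = L @ L.T (no explicit inverse).
    Illustrative sketch, not the classifier proposed in the paper."""
    L = np.linalg.cholesky(cov)        # L is lower triangular
    z = np.linalg.solve(L, x - mean)   # solve L z = x - mean
    # (x-mean)' cov^{-1} (x-mean) = z'z, and log det(cov) = 2 * sum(log diag(L))
    return -0.5 * z @ z - np.sum(np.log(np.diag(L)))

# Toy usage: assign a pixel to the class with the highest score
# (means and covariance here are hypothetical, not from the paper's data).
d = 5
mu0, mu1 = np.zeros(d), np.ones(d)
cov = 0.5 * np.eye(d)
x = 0.9 * np.ones(d)
scores = [gaussian_discriminant(x, m, cov) for m in (mu0, mu1)]
label = int(np.argmax(scores))
```

Because the score depends on the covariance only through its Cholesky factor, a sparsity pattern imposed on that lower triangular factor, as the paper proposes, reduces the number of parameters to estimate without ever requiring the inverse covariance matrix.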
IEEE Transactions on Geoscience and Remote Sensing