Abstract

Several spectral unmixing techniques that use multiple endmembers per class have been developed. Although they can address within-class spectral variability, their results may suffer from low unmixing resolution when the within-class variation is large, owing to the associated high uncertainty. It is therefore critical to represent the data in an effective feature space in which the endmember classes are compact, with small variation. In this letter, the minimum-class-variance support vector machine (MCVSVM) is further developed so that it supports both classification and spectral unmixing. Moreover, analytical expressions for the spectral unmixing resolution (SUR) are provided to measure the spectral unmixing uncertainty in the new feature space. The extended MCVSVM (e_MCVSVM) improves the SUR and reduces the spectral unmixing uncertainty because it effectively maximizes the between-class scatter while minimizing the within-class scatter. Experimental results show that the e_MCVSVM algorithm outperforms other algorithms (e.g., fully constrained least squares and endmember bundles) in both unmixing accuracy and computation speed, in both the linearly separable and the nonseparable cases. The proposed approach thus advances linear spectral mixture analysis with greater speed and higher accuracy, building on the SVM once the SUR is effectively characterized.
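For context on the baseline the letter compares against, the following is a minimal sketch of linear spectral mixture analysis solved with fully constrained least squares (FCLS), i.e., abundances constrained to be nonnegative and sum to one. The endmember matrix and abundance values below are made-up illustration data, and the heavily weighted sum-to-one row is a common approximation trick, not the paper's e_MCVSVM method.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(E, y, delta=1e3):
    """Fully constrained least squares unmixing.

    E : (bands, n_endmembers) endmember spectra matrix
    y : (bands,) observed mixed pixel
    Enforces a >= 0 exactly (via NNLS) and sum(a) = 1 approximately,
    by appending a row of ones weighted by a large factor delta.
    """
    n_end = E.shape[1]
    E_aug = np.vstack([E, delta * np.ones((1, n_end))])
    y_aug = np.append(y, delta)
    a, _ = nnls(E_aug, y_aug)  # nonnegative least squares
    return a

# Hypothetical 4-band pixel mixed from two endmember spectra
E = np.array([[0.1, 0.9],
              [0.2, 0.8],
              [0.3, 0.7],
              [0.4, 0.6]])
a_true = np.array([0.3, 0.7])
y = E @ a_true           # noise-free linear mixture
a_est = fcls_unmix(E, y) # recovers abundances close to a_true
```

When within-class spectral variation is large, a single column per class in `E` is a poor fit, which is the motivation for multiple-endmember and feature-space methods such as the e_MCVSVM discussed above.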
