Abstract

Due to the high dimensionality of hyperspectral images, dimensionality reduction (DR) is an important preprocessing step for classification. Recently, sparse and low-rank graph-based discriminant analysis (SLGDA) has been developed for DR of hyperspectral images; it exploits sparsity and low-rankness simultaneously to capture both the local and global structures of the data. However, SLGDA may not achieve satisfactory results on complex data with a nonlinear structure. To address this problem, this paper presents two kernel extensions of SLGDA. In the first, classical kernel SLGDA ($c$KSLGDA), the kernel trick is exploited to implicitly map the original data into a high-dimensional feature space. From a different perspective, we further propose a Nyström-based kernel SLGDA ($n$KSLGDA), which constructs a virtual kernel space via the Nyström method, in which virtual samples are obtained explicitly from the original data. Both $c$KSLGDA and $n$KSLGDA yield more informative graphs than SLGDA and outperform other state-of-the-art DR methods. More importantly, $n$KSLGDA outperforms $c$KSLGDA at a much lower computational cost.
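The "virtual samples" mentioned above refer to an explicit, finite-dimensional feature map obtained from the Nyström low-rank approximation of a kernel matrix. The following is a minimal sketch of that general construction, not the paper's actual implementation: the RBF kernel, the landmark count `m`, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_virtual_samples(X, m, gamma=0.5, seed=0):
    """Explicit Nystrom feature map: returns an (n, m) matrix Z whose
    rows act as 'virtual samples', with Z @ Z.T approximating the full
    n x n kernel matrix at rank m."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)  # landmark points
    K_mm = rbf_kernel(X[idx], X[idx], gamma)         # m x m landmark kernel
    K_nm = rbf_kernel(X, X[idx], gamma)              # n x m cross-kernel
    w, V = np.linalg.eigh(K_mm)
    w = np.maximum(w, 1e-12)                         # guard tiny eigenvalues
    # Z = K_nm @ K_mm^{-1/2}, so Z @ Z.T = K_nm @ K_mm^{-1} @ K_nm.T
    Z = K_nm @ V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    return Z

# usage: 50 samples in 8 dimensions, compressed to 20 virtual features
X = np.random.default_rng(1).normal(size=(50, 8))
Z = nystrom_virtual_samples(X, m=20)
K_approx = Z @ Z.T  # low-rank approximation of rbf_kernel(X, X)
```

Because the map is explicit, any linear DR method can then be run directly on the rows of `Z`, which is the source of the cost advantage claimed for $n$KSLGDA over operating implicitly on the full $n \times n$ kernel matrix.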
