Abstract

This paper proposes an efficient k Nearest Neighbors (kNN) method based on a graph sparse reconstruction framework, called the Graph Sparse kNN (GS-kNN) algorithm. We first design a reconstruction process between training and test samples to obtain a k value for each test sample. We then apply the resulting variable-k kNN algorithm (i.e., GS-kNN) to learning tasks such as classification, regression, and missing value imputation. In the reconstruction process, we employ a least squares loss function to minimize the reconstruction error, use an ℓ1-norm to generate different k values for different test samples, impose an ℓ2,1-norm to induce row sparsity that removes the impact of noisy training samples, and utilize Locality Preserving Projection (LPP) to preserve the local structure of the data. With this objective function, GS-kNN obtains the correlation between each test sample and the training samples, which is then used to design new classification, regression, and missing value imputation rules for real applications. Finally, the proposed GS-kNN method is evaluated with extensive experiments on real datasets, covering classification, regression, and missing value imputation, and the results show that GS-kNN outperforms previous kNN algorithms in terms of classification accuracy, correlation coefficient, and root mean square error (RMSE).

Keywords: k Nearest Neighbors; sparse; reconstruction process; Locality Preserving Projection
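To make the reconstruction step concrete, the sketch below (function and parameter names are our own, not from the paper, and the LPP term is omitted for brevity) approximates the described objective — a least squares reconstruction of each test sample from the training samples, with an ℓ1 penalty that yields a per-test-sample k and an ℓ2,1 penalty that zeroes out rows of noisy training samples — using simple proximal gradient steps:

```python
import numpy as np

def soft_threshold(A, t):
    # Elementwise l1 proximal step: shrink each coefficient toward zero.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def row_shrink(A, t):
    # l2,1 proximal step: shrink whole rows, so a noisy training
    # sample's row of coefficients can vanish entirely.
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    return A * np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)

def gs_knn_weights(X_train, X_test, lam1=0.1, lam21=0.1, n_iter=200):
    """Approximate reconstruction weights W (n_train x n_test):
    minimize 0.5*||X_train^T W - X_test^T||_F^2
             + lam1*||W||_1 + lam21*||W||_{2,1}
    by gradient steps on the smooth part followed by the two
    proximal shrinkages (an approximation, not an exact solver)."""
    A = X_train @ X_train.T            # Gram matrix of training samples
    B = X_train @ X_test.T
    W = np.zeros((X_train.shape[0], X_test.shape[0]))
    step = 1.0 / (np.linalg.norm(A, 2) + 1e-12)  # 1/Lipschitz constant
    for _ in range(n_iter):
        W = W - step * (A @ W - B)     # gradient of the least squares term
        W = soft_threshold(W, step * lam1)
        W = row_shrink(W, step * lam21)
    return W

# The k for each test sample is the number of training samples with a
# nonzero reconstruction coefficient in its column of W.
def per_sample_k(W, tol=1e-8):
    return (np.abs(W) > tol).sum(axis=0)
```

Here composing the two shrinkage operators is a common heuristic rather than the exact proximal operator of the summed penalty; the paper's actual optimization may differ.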
