Abstract

KNN classification is a lazy learning method: only when a test sample is to be predicted does the algorithm set a suitable K value and search for the K nearest neighbors in the whole training sample space. This deferred step, referred to here as the lazy part of KNN classification, has been the bottleneck of applying KNN classification because it requires an exhaustive search for the K nearest neighbors. In this paper, a one-step computation is proposed to replace the lazy part. The one-step computation transforms the lazy part into a matrix computation as follows. Given a test sample, the training samples are first fitted to it under a least squares loss. A relationship matrix is then generated by weighting all training samples according to their influence on the test sample. Finally, a group lasso is employed to perform sparse learning on the relationship matrix. In this way, setting the K value and searching for the K nearest neighbors are integrated into a single unified computation. In addition, a new classification rule is proposed to improve the performance of one-step KNN classification. The proposed approach is evaluated experimentally, and the results demonstrate that one-step KNN classification is efficient and promising.
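To make the pipeline above concrete, here is a minimal sketch of the sparse-reconstruction step in plain NumPy. It is not the authors' implementation: the proximal-gradient (ISTA) solver, the choice of grouping coefficients by class label, and the parameter values (`lam`, `n_iter`) are all assumptions chosen for illustration. The sketch solves min_w 1/2 ||X^T w - x||^2 + lam * sum_g ||w_g||_2, so entire groups of training-sample weights can be driven exactly to zero, which is what removes the need to pick K explicitly.

```python
import numpy as np

def group_lasso_weights(X, x, groups, lam=0.1, n_iter=500):
    """Solve min_w 0.5*||X.T @ w - x||^2 + lam * sum_g ||w_g||_2 via ISTA.

    X       (n, d): training samples, one row per sample.
    x       (d,):   the test sample to reconstruct.
    groups  (n,):   group id per training sample (e.g. its class label).
    Returns w (n,): sample weights; whole groups may be exactly zero.
    """
    n = X.shape[0]
    w = np.zeros(n)
    lipschitz = np.linalg.norm(X, 2) ** 2      # largest eigenvalue of X @ X.T
    step = 1.0 / lipschitz
    for _ in range(n_iter):
        grad = X @ (X.T @ w - x)               # gradient of the squared loss
        z = w - step * grad
        for g in np.unique(groups):            # group soft-thresholding (prox)
            idx = groups == g
            norm = np.linalg.norm(z[idx])
            # groups whose norm falls below the threshold vanish entirely
            z[idx] = 0.0 if norm <= step * lam else z[idx] * (1 - step * lam / norm)
        w = z
    return w
```

On one plausible reading of the abstract, the surviving nonzero entries of w play the role of the K nearest neighbors, with K determined implicitly by the regularization strength `lam` rather than set by hand.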

Highlights

  • KNN classification is one of the top 10 algorithms in data mining [1]

  • In terms of its learning procedure, a KNN classification algorithm is almost independent of the training data, i.e., it learns nothing from the training dataset before a query is submitted

  • A relationship matrix is generated by weighting all training samples according to their influence on the test sample (a sketch of how such weights can drive classification follows this list)
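The abstract mentions a new classification rule but does not spell it out, so the following is a hedged stand-in rather than the paper's rule: treat the training samples that keep a nonzero weight after sparse learning as the selected neighbors, and let each vote for its class with the magnitude of its weight. The function name and the weighted-vote policy are assumptions for illustration.

```python
import numpy as np

def classify_from_weights(w, y):
    """Weighted vote over the training samples the group lasso kept.

    Illustrative stand-in for the paper's (unspecified) classification
    rule: nonzero-weight samples act as the implicitly chosen K nearest
    neighbors, each voting for its label with strength |w_i|.
    """
    support = np.flatnonzero(w)            # implicitly selected neighbors
    if support.size == 0:
        return None                        # nothing selected, no prediction
    labels = np.unique(y[support])
    scores = [np.abs(w[support][y[support] == c]).sum() for c in labels]
    return labels[int(np.argmax(scores))]
```

Chaining the two sketches, `w = group_lasso_weights(X_train, x_test, groups=y_train)` followed by `classify_from_weights(w, y_train)` captures the one-step flavor: no K is ever set and no explicit neighbor search is run.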


Summary



INTRODUCTION
K value calculation
Neighbor search
Notation
Framework
Convergence analysis
Datasets and Settings
Compared Algorithms
Experimental setting
Parameter sensitivity
Running cost analysis
Findings
CONCLUSION