Abstract

Most studies on linear k-nearest neighbor methods assume independent and identically distributed training and testing samples. However, this assumption does not hold in many practical applications, particularly when each class shares a homogeneous style. We propose a novel style linear k-nearest neighbor method to achieve two goals: (a) the linear k-nearest neighbor method should mine stylistic data for explicit or implicit stylistic features while obtaining linear expressions, and effectively transfer them to the predictor through matrix expressions; (b) the similarity of stylistic features between testing and training samples should be numerically quantified to enhance the generalizability of the predictor. To this end, we first introduce style matrices to express the style information of each class. To prevent the style matrices from degenerating into identity matrices, we introduce enhanced nodes that separate the manifold structure of the original data. The dual representation of style matrices and enhanced nodes makes it easy to extract stylistic features from data. Furthermore, we introduce the style membership vector, for the first time in a linear k-nearest neighbor method, to calculate the style similarity of testing samples to each class, so that the labels of testing samples can be determined easily and accurately. We also propose an implementable alternating optimization strategy for the proposed method, which decomposes the complex optimization problem into independent subproblems that are easy to solve. The experimental results demonstrate that the proposed method maintains comparable generalization capability on 9 ordinary datasets, and outperforms the comparative methods on 6 stylistic datasets, achieving a 9.76% improvement in average testing accuracy over the state-of-the-art weighted locally linear k-nearest neighbor method.
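The abstract does not give the formal model, but the prediction step it describes, weighting neighbor votes by a per-class style membership computed in an enhanced feature space, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the `enhance`, `style_membership`, and `predict` functions, the tanh random-projection enhancement nodes, and the softmax-of-distances membership are our own simplified stand-ins, not the paper's actual formulation.

```python
import numpy as np

def enhance(X, W, b):
    """Append illustrative 'enhancement nodes' (here: tanh of a random
    projection) to the raw features; a stand-in for the paper's enhanced nodes."""
    return np.hstack([X, np.tanh(X @ W + b)])

def style_membership(x_enh, class_means):
    """Toy style membership vector: softmax of negative distances from the
    enhanced test sample to each class's mean enhanced representation."""
    d = np.array([np.linalg.norm(x_enh - m) for m in class_means])
    w = np.exp(-d)
    return w / w.sum()

def predict(x, X_train, y_train, W, b, k=5):
    """k-NN voting in the enhanced space, re-weighted by style membership."""
    classes = np.unique(y_train)
    X_enh = enhance(X_train, W, b)
    x_enh = enhance(x[None, :], W, b)[0]
    class_means = [X_enh[y_train == c].mean(axis=0) for c in classes]
    membership = style_membership(x_enh, class_means)

    # plain k nearest neighbours of the test sample in the enhanced space
    dist = np.linalg.norm(X_enh - x_enh, axis=1)
    nn = np.argsort(dist)[:k]

    # each neighbour votes with the membership weight of its own class
    scores = {c: 0.0 for c in classes}
    for i in nn:
        c = y_train[i]
        scores[c] += membership[np.where(classes == c)[0][0]]
    return max(scores, key=scores.get)

# toy usage with synthetic data
rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 4))
y_train = np.repeat([0, 1, 2], 20)
W = rng.normal(size=(4, 8))
b = rng.normal(size=8)
print(predict(rng.normal(size=4), X_train, y_train, W, b))
```

The sketch only conveys the idea of quantifying a test sample's style similarity to each class and letting that similarity modulate the k-nearest neighbor decision; the paper's actual method learns style matrices and solves the problem via alternating optimization, which is not reproduced here.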
