Abstract

Semi-Supervised Learning (SSL) has attracted much attention in machine learning and data mining. As an extension of the Support Vector Machine (SVM), the Semi-Supervised Support Vector Machine (S3VM) was proposed for SSL. Recent studies have shown that optimising the margin distribution is more crucial than maximising the minimum margin for achieving good classification performance. However, existing S3VM models still follow the idea of maximising the minimum margin. This paper therefore proposes a novel Laplacian Large margin Distribution Machine (LapLDM) for SSL to enhance classification performance. The method optimises the margin distribution by maximising the first-order statistic of the margins (the margin mean) and minimising the second-order statistic (the margin variance), and it exploits the geometric information of the marginal distribution embedded in the unlabelled data through a Laplacian regularizer. The paper then develops a Preconditioned Conjugate Gradient (PCG) algorithm to solve the nonlinear LapLDM model on regular-scale data sets, and a Stochastic Variance Reduced Gradient (SVRG) algorithm to solve the linear LapLDM model on large-scale data sets. These algorithms accelerate the training of the proposed models and make them applicable to large-scale problems. Finally, numerical results on four artificial and fourteen public benchmark data sets demonstrate that the LapLDM is superior to several well-known S3VM models.
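To make the two margin statistics and the Laplacian term concrete, the following is a minimal sketch of how such an objective is typically assembled from the quantities named above; the trade-off weights $\lambda_1$, $\lambda_2$, $\gamma_I$, $C$ and the exact form are illustrative assumptions, not the paper's verbatim formulation. For $l$ labelled examples $(x_i, y_i)$ with decision function $f(x) = w^{\top}\phi(x)$, each margin is $\gamma_i = y_i\, w^{\top}\phi(x_i)$, and

\begin{align*}
\bar{\gamma} &= \frac{1}{l}\sum_{i=1}^{l} y_i\, w^{\top}\phi(x_i)
  &&\text{(first-order statistic: margin mean)}\\
\hat{\gamma} &= \frac{1}{l}\sum_{i=1}^{l}\bigl(y_i\, w^{\top}\phi(x_i) - \bar{\gamma}\bigr)^{2}
  &&\text{(second-order statistic: margin variance)}\\
\min_{w}\;& \frac{1}{2}\lVert w\rVert^{2}
  + \lambda_{1}\,\hat{\gamma}
  - \lambda_{2}\,\bar{\gamma}
  + C\sum_{i=1}^{l}\max\bigl(0,\,1 - y_i\, w^{\top}\phi(x_i)\bigr)
  + \gamma_{I}\,\mathbf{f}^{\top} L\,\mathbf{f}
\end{align*}

Here $\mathbf{f}$ collects the predictions on all $l$ labelled and $u$ unlabelled points, and $L$ is the graph Laplacian built over those $l+u$ points; minimising $\mathbf{f}^{\top} L \mathbf{f}$ encourages nearby points on the data manifold to receive similar outputs, which is how the unlabelled data enter the model.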
