Abstract

Semi-supervised twin support vector machines based on Laplacian regularization (LapTSVM) have recently received extensive attention across many areas of machine learning. Unfortunately, the Laplacian regularizer has constant functions in its null space, so the learned solution is often close to a constant function and fails to preserve the local topology of the samples. To address these problems, we first construct a Hessian scatter regularization (HSR) term. HSR has two major advantages: (1) it favors functions that vary linearly along geodesic distances and thus preserves the local manifold structure of the samples well; (2) it seeks a projection from the original space to the feature space that maximizes the inter-class scatter and minimizes the intra-class scatter of the samples, where the scatter is regarded as the discriminative (structural) information of the samples. Second, by introducing HSR, we propose a Hessian scatter regularized twin support vector machine (HSR-TSVM). Compared with LapTSVM, HSR-TSVM exploits both the global and local structure of the data to overcome the poor extrapolation caused by Laplacian regularization, while retaining almost all the advantages of the classic LapTSVM. Furthermore, to improve the computational efficiency of HSR-TSVM, we propose its least-squares version, HSR-LSTSVM, and solve it with the conjugate gradient method. Experimental results on four synthetic datasets, ten UCI datasets, and four image datasets show that the proposed methods are competitive with semi-supervised learning methods based on Laplacian regularization.
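The null-space weakness the abstract attributes to Laplacian regularization can be seen directly on a toy graph: for the graph Laplacian L = D - W, any constant vector incurs zero penalty, so the regularizer cannot distinguish a constant labeling from the data's geometry. A minimal sketch (the graph below is an arbitrary illustration, not from the paper):

```python
import numpy as np

# Symmetric adjacency matrix of a small 4-node graph (hypothetical example).
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))  # degree matrix
L = D - W                   # unnormalized graph Laplacian

# Constant functions lie in the null space: L @ 1 = 0,
# so the Laplacian penalty f^T L f vanishes for any constant f.
ones = np.ones(4)
print(L @ ones)             # zero vector

# A non-constant labeling is penalized: f^T L f sums w_ij (f_i - f_j)^2
# over the graph's edges.
f = np.array([0.0, 0.0, 1.0, 1.0])
print(f @ L @ f)            # → 2.0 (edges (0,2) and (1,2) each contribute 1)
```

This zero penalty on constants is exactly why a Laplacian-regularized solution can degenerate toward a constant function away from the labeled points.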
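The abstract states that HSR-LSTSVM reduces to a linear system solved by the conjugate gradient method. The paper's exact system matrix is not given here, but least-squares TSVM variants typically lead to regularized normal equations of the general shape (H^T H + c I) w = H^T e with a symmetric positive-definite left-hand side, which is the setting where conjugate gradients applies. A generic sketch under that assumption (matrix names and the toy data are hypothetical):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # conjugate update of the direction
        rs_old = rs_new
    return x

# Toy regularized least-squares system of the assumed shape.
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 5))   # stand-in for an augmented data matrix
c = 0.1                            # regularization parameter
A = H.T @ H + c * np.eye(5)        # symmetric positive definite
b = H.T @ np.ones(50)
w = conjugate_gradient(A, b)
print(np.linalg.norm(A @ w - b))   # residual is essentially zero
```

For a small dense system one would simply call `np.linalg.solve`; conjugate gradients pays off when A is large and only matrix-vector products are affordable, which is the efficiency argument made for HSR-LSTSVM.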
