Abstract

The article proposes a new approach to constructing a support vector machine with semi-supervised learning for solving a classification problem. It is assumed that the class distributions may overlap. The cost function is modified by adding a penalty term for labeled points that fall outside their class; the penalty is a linear function of the distance between the labeled point and the class boundary. To deal with the resulting multicriteria optimization problem, a global optimization technique known as the continuation method is proposed. To combine predictions, voting over models with different kernels is suggested. The Optuna framework was chosen as the tool for tuning hyperparameters. The following training samples were considered: type_dataset, banana, banana_inverse, c_circles, two_moons_classic, two_moons_tight, two_moons_wide.
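
As a concrete illustration of the pipeline outlined above, the sketch below tunes single-kernel SVMs with Optuna and combines them by hard voting, using scikit-learn's make_moons as a stand-in for the two_moons_classic sample. This is a hedged sketch under stated assumptions, not the authors' implementation: the penalty-modified cost function and the continuation method are not reproduced, and the boundary_penalty helper is only a hypothetical reading of the linear distance-based penalty.

import optuna
from sklearn.datasets import make_moons
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

def boundary_penalty(distance_to_boundary, weight=1.0):
    # Hypothetical reading of the abstract's penalty: a term that grows linearly
    # with how far a labeled point lies on the wrong side of its class boundary.
    # Shown for illustration only; it is not wired into the models below.
    return weight * max(0.0, distance_to_boundary)

# make_moons serves as a stand-in for the two_moons_classic training sample.
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def make_objective(kernel):
    # Cross-validated objective for tuning a single-kernel SVM with Optuna.
    def objective(trial):
        params = {"kernel": kernel, "C": trial.suggest_float("C", 1e-3, 1e3, log=True)}
        if kernel == "rbf":
            params["gamma"] = trial.suggest_float("gamma", 1e-4, 1e1, log=True)
        elif kernel == "poly":
            params["degree"] = trial.suggest_int("degree", 2, 4)
        return cross_val_score(SVC(**params), X_train, y_train, cv=5).mean()
    return objective

tuned_models = []
for kernel in ("linear", "rbf", "poly"):
    study = optuna.create_study(direction="maximize")
    study.optimize(make_objective(kernel), n_trials=30)
    tuned_models.append((kernel, SVC(kernel=kernel, **study.best_params)))

# Combine the differently-kernelled models by hard voting, as suggested in the abstract.
ensemble = VotingClassifier(estimators=tuned_models, voting="hard")
ensemble.fit(X_train, y_train)
print("test accuracy:", ensemble.score(X_test, y_test))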
