The relevance of artificial intelligence (AI) systems grows every year, and AI is being introduced into ever more fields of activity. One of the core technologies behind AI is the artificial neural network (hereinafter referred to as NN). Neural networks solve a huge class of problems, including classification, regression, autoregression, clustering, noise reduction, and the construction of vector representations of objects. In this work we consider the simplest case, the operation of a single neuron with the Heaviside activation function; we also consider fast ways to train it, reducing the learning problem to finding the normal vector to the separating hyperplane and the displacement (bias) weight. One promising direction in NN training is non-iterative training, especially in the context of processing and analyzing high-dimensional data. This article discusses a non-iterative learning method that speeds up the training of a single neuron considerably (by 1-2 orders of magnitude). The distinctive feature of the approach is that the hyperplane separating two classes of objects in feature space is determined without the repeated recalculation of weights typical of traditional iterative methods. Within the study, special attention is paid to cases where the principal axes of the ellipsoids describing the classes are parallel. A function pln is defined to calculate the distances between objects and the centers of their classes; from these distances, the non-normalized normal vector to the hyperplane and the displacement weight are computed. In addition, we compare our method with support vector machines and logistic regression.
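The core idea, a Heaviside neuron trained without iteration by deriving the separating hyperplane from class statistics, can be sketched as follows. This is a minimal illustration only: the paper's pln-based distance computation is replaced here by a simple centroid-midpoint placement of the hyperplane, and all function names are hypothetical.

```python
import numpy as np

def heaviside_neuron(w, b, x):
    # Single neuron: outputs 1 if w.x + b >= 0, else 0.
    return np.heaviside(x @ w + b, 1.0)

def fit_non_iterative(X0, X1):
    # Non-iterative training sketch: the (non-normalized) normal vector
    # to the separating hyperplane is the difference of the class centroids,
    # and the displacement weight (bias) places the plane at their midpoint.
    # No repeated weight recalculation is performed.
    c0 = X0.mean(axis=0)
    c1 = X1.mean(axis=0)
    w = c1 - c0                    # non-normalized normal vector
    b = -w @ ((c0 + c1) / 2.0)     # displacement weight
    return w, b

# Two well-separated synthetic classes.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(100, 2))
X1 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(100, 2))

w, b = fit_non_iterative(X0, X1)
acc = (np.mean(heaviside_neuron(w, b, X0) == 0.0)
       + np.mean(heaviside_neuron(w, b, X1) == 1.0)) / 2.0
```

On data of this kind the single pass over the samples already yields a separating hyperplane, which is what makes the approach 1-2 orders of magnitude faster than iterative weight updates.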