Abstract

It is known that implementing recognition problems with classical neural networks involves a number of difficulties: a large training set is required; the learning algorithms are long and complex; design parameters such as the number of neurons, layers, and links, as well as the way neurons are connected, are hard to choose; and training may fail, requiring the network settings to be changed and the network retrained. In this paper we consider the possibility of creating a multilayer perceptron with a full system of connections and a threshold activation function on the basis of metric recognition methods, in particular the nearest neighbor algorithm. We show that this approach makes it possible to create a fully connected multilayer perceptron whose parameters, such as the number of neurons and layers as well as the weight and threshold values, are determined analytically. The distribution of weight and threshold values for the second and third layers is also discussed. On this basis, we propose an algorithm for calculating the thresholds and weights of a multilayer perceptron and show an example of its implementation. Possible applications of the network to different tasks are considered.
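
As an illustration of the general idea, the sketch below shows how the decision rule of the nearest neighbor algorithm can be written as a small threshold network whose weights and thresholds are computed analytically from the reference patterns rather than trained. This is only a minimal sketch, assuming Euclidean distance and a Heaviside threshold activation; the function names (build_first_layer, classify) and the pairwise-comparison construction of the first layer are our own illustration, not code from the paper.

import numpy as np

def heaviside(s):
    # Threshold activation: 1 if the state is non-negative, else 0.
    return (s >= 0).astype(int)

def build_first_layer(references):
    # For every ordered pair of reference patterns (k, j), j != k, build a
    # threshold neuron that fires when the input is closer (Euclidean) to
    # reference k than to reference j:
    #   ||x - r_k||^2 <= ||x - r_j||^2  <=>  2(r_k - r_j)·x >= ||r_k||^2 - ||r_j||^2.
    # Weights and thresholds are therefore computed analytically, not trained.
    N = len(references)
    weights, thresholds, owner = [], [], []
    for k in range(N):
        for j in range(N):
            if j == k:
                continue
            weights.append(2.0 * (references[k] - references[j]))
            thresholds.append(np.dot(references[k], references[k])
                              - np.dot(references[j], references[j]))
            owner.append(k)                    # which reference this comparison serves
    return np.array(weights), np.array(thresholds), np.array(owner)

def classify(x, references):
    # Second layer: the k-th neuron acts as an AND over the N-1 comparison
    # neurons of reference k, so it fires only when x is nearest to reference k.
    W1, T1, owner = build_first_layer(references)
    y1 = heaviside(W1 @ x - T1)               # first-layer outputs
    N = len(references)
    y2 = np.array([int(y1[owner == k].all()) for k in range(N)])
    return int(np.argmax(y2))                 # index of the winning reference

# Usage: three 2-D reference patterns; the result matches 1-nearest-neighbor.
refs = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
print(classify(np.array([0.9, 0.8]), refs))   # -> 1

In this construction the number of neurons and layers follows directly from the number of reference patterns, which mirrors the analytic determination of the network parameters described in the abstract.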

Highlights

  • We propose an algorithm for calculating the thresholds and weights of a multilayer perceptron and show an example of its implementation

  • We consider the possibility of creating a multilayer perceptron with a full system of connections and a threshold activation function on the basis of metric recognition methods, in particular the nearest neighbor algorithm

Summary

In this case, if the performance (1) of the resulting network based on metric recognition methods does not satisfy the conditions of the problem, the network can be further trained with existing classical learning algorithms, reducing the errors (2) of the recognition problem, as shown, for example, in Figure 1 for ΔF1, ΔF2, ΔF3, ΔF4, ΔF5, ΔF6, ΔF7. Conditions (4) and (8) mean that when all inputs of the k-th neuron of the second layer are active, the output of the neuron is 1, and 0 otherwise. It can also be shown that, for the k-th neuron of the second layer and for all added connections ($i \neq k$), conditions (14) and (15) must hold in order to keep the activation function (8) stable. The condition in (15) follows from the fact that when all $N-1$ neurons of the first layer for the k-th pattern are active ($f(S_{nk,j}^{(1)}) = 1$), the output of the neuron must be $f(S_{nk}^{(2)}) = 1$. From (18) it can be seen that the state function of the k-th neuron of the second layer ($S_{nk}^{(2)}$) takes its minimum value when $f(S_{ni,j}^{(1)}) = 0$ for all $i \neq k$ for which $w_{i,j}^{(2)} > 0$, and conversely, …

Here, for all neurons of the third layer, the threshold value …
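
The AND-like behaviour described above for the k-th second-layer neuron can be checked with a short sketch: with unit weights on its $N-1$ inputs and a threshold equal to $N-1$, a threshold neuron outputs 1 exactly when all of its inputs are active. This only illustrates conditions (4) and (8) as summarized above; the specific weight and threshold conditions (14), (15), and (18) of the paper are not reproduced here, and the variable names are ours.

import itertools
import numpy as np

def second_layer_neuron(inputs, weights, threshold):
    # Threshold activation f(S): 1 if the state S = w·y reaches the threshold.
    state = np.dot(weights, inputs)
    return int(state >= threshold)

m = 3                                  # N-1 first-layer neurons of the k-th pattern
weights = np.ones(m)                   # unit weights on the neuron's own connections
threshold = float(m)                   # fires only when all m inputs equal 1

for y in itertools.product([0, 1], repeat=m):
    out = second_layer_neuron(np.array(y, dtype=float), weights, threshold)
    assert out == int(all(y))          # output is 1 iff all inputs are active
print("second-layer neuron behaves as an AND over its inputs")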
