Abstract

In this paper we present an analysis of, and solutions to, problems related to the initial positioning of neurons in a classic self-organizing map (SOM) neural network; we are not concerned with the multitude of growing variants, where new neurons are placed where needed. We consider placing the neurons on a Hilbert curve, since SOMs tend to converge toward self-similar curves. Another point of adjustment in a SOM is the initial number of neurons, which depends on the data set. Our investigations show that initializing the neurons on a self-similar curve such as the Hilbert curve provides quality coverage of the input topology in far fewer epochs than the usual random neuron placement. Quality here is measured by the absence of tangles in the network, a one-dimensional SOM trained with the traditional Kohonen algorithm. Tangling occurs when neurons that are topologically close in the input space lie far apart along the neuron chain of the 1D network, which hampers proper clustering and the analysis of cluster labels and classification. We also experiment with, and provide an analysis of, the choice of the number of neurons.
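The initialization described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a 2D unit-square input space, uses the standard distance-to-coordinate (d2xy) Hilbert-curve mapping, and the helper name `hilbert_init` is hypothetical.

```python
import numpy as np

def hilbert_d2xy(order, d):
    """Map distance d along a Hilbert curve of the given order to
    integer grid coordinates (x, y) on a 2**order x 2**order grid
    (the classic iterative d2xy algorithm)."""
    x = y = 0
    s = 1
    side = 1 << order
    while s < side:
        rx = 1 & (d // 2)
        ry = 1 & (d ^ rx)
        if ry == 0:                       # rotate the quadrant if needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        d //= 4
        s *= 2
    return x, y

def hilbert_init(n_neurons, order=5):
    """Place a 1D chain of n_neurons at evenly spaced positions along
    a Hilbert curve, scaled into the unit square (hypothetical helper:
    the paper's exact placement scheme may differ)."""
    side = 1 << order
    length = side * side                  # curve visits every grid cell once
    ds = np.linspace(0, length - 1, n_neurons).astype(int)
    pts = np.array([hilbert_d2xy(order, int(d)) for d in ds], dtype=float)
    return (pts + 0.5) / side             # cell centers, normalized to (0, 1)^2
```

Because consecutive points on a Hilbert curve are always grid neighbors, neurons adjacent in the 1D chain start out spatially adjacent in the input space, which is the property that discourages tangles compared to random placement.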
