Abstract
Self-organizing neural networks typically focus on prototype learning, while the network topology is held fixed during training. Here we propose a method that adapts the topology of the network so that it reflects the internal structure of the input distribution. This yields a self-organizing graph in which each unit is a component of a mixture of Gaussians (MoG). The corresponding update equations are derived within the stochastic approximation framework. The approach combines the advantages of probabilistic mixtures with those of self-organization. Experimental results demonstrate the self-organization ability of the proposal and its performance on multivariate datasets in classification and image segmentation tasks.