Abstract

This article presents the design principle of a neural network using Gaussian activation functions, referred to as a Gaussian Potential Function Network (GPFN), and explores the capability of a GPFN to learn a continuous input-output mapping from a given set of teaching patterns. The design principle is highlighted by a Hierarchically Self-Organizing Learning (HSOL) algorithm featuring the automatic recruitment of hidden units under the paradigm of hierarchical learning. A GPFN generates a potential field of arbitrary shape over the domain of the input space, as an input-output mapping, by synthesizing a number of Gaussian potential functions provided by individual hidden units, referred to as Gaussian Potential Function Units (GPFUs). The construction of a GPFN is carried out by the HSOL algorithm, which incrementally recruits the minimum necessary number of GPFUs based on the control of the effective radii of individual GPFUs, and which trains the locations (mean vectors) and shapes (variances) of the individual Gaussian potential functions, as well as their summation weights, using the Backpropagation algorithm. Simulations were conducted to demonstrate and evaluate GPFNs constructed by the HSOL algorithm on several sets of teaching patterns.
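
To make the architecture concrete, the sketch below (not the paper's own code) implements in Python/NumPy the kind of network the abstract describes: a weighted sum of Gaussian potential units whose mean vectors, shape parameters, and summation weights are adapted by gradient descent on a squared error, plus a crude HSOL-style recruitment step that adds a GPFU wherever the error is large, with an effective radius that shrinks across hierarchy levels. The exact Gaussian form (per-axis inverse variances), the learning rate, the error tolerance, and the radius schedule are all assumptions for illustration, not the paper's settings.

    import numpy as np

    class GPFN:
        """Sketch of a Gaussian Potential Function Network (assumed form:
        output = sum_i w_i * exp(-0.5 * sum_k k_ik * (x_k - m_ik)^2))."""

        def __init__(self, input_dim):
            self.means = np.empty((0, input_dim))    # GPFU locations (mean vectors)
            self.shapes = np.empty((0, input_dim))   # per-axis inverse variances
            self.weights = np.empty(0)               # summation weights

        def _potentials(self, x):
            # Gaussian potential of every GPFU at input x
            d = x - self.means
            return np.exp(-0.5 * np.sum(self.shapes * d * d, axis=1))

        def predict(self, x):
            return float(self.weights @ self._potentials(x))

        def recruit(self, x, target, radius):
            # New GPFU centered on a poorly fitted pattern; its initial weight
            # makes the network exact at x, and `radius` sets its spread.
            self.means = np.vstack([self.means, x])
            self.shapes = np.vstack([self.shapes, np.full_like(x, 1.0 / radius**2)])
            self.weights = np.append(self.weights, target - self.predict(x))

        def train_step(self, x, target, lr=0.05):
            # One gradient (backpropagation) step on the squared error,
            # through weights, means, and shape parameters.
            psi = self._potentials(x)
            err = self.predict(x) - target
            d = x - self.means
            g = (err * self.weights * psi)[:, None]  # shared gradient factor
            self.weights -= lr * err * psi
            self.means -= lr * g * self.shapes * d
            # gradient w.r.t. inverse variances; clamp to keep them positive
            self.shapes = np.maximum(self.shapes + lr * g * 0.5 * d * d, 1e-6)
            return err

A toy run on a one-dimensional mapping shows the coarse-to-fine recruitment idea: at each hierarchy level, a unit is added only where the network still fails and no existing unit is within the current effective radius, so the unit count stays small.

    # Toy run: learn y = sin(x) on [0, 2*pi]; all values are illustrative.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
    T = np.sin(X[:, 0])

    net = GPFN(input_dim=1)
    for radius in (2.0, 1.0, 0.5):        # shrinking effective radii
        for _ in range(100):              # training passes at this level
            for x, t in zip(X, T):
                if abs(net.predict(x) - t) > 0.1 and (
                    net.means.size == 0
                    or np.min(np.linalg.norm(net.means - x, axis=1)) > radius
                ):
                    net.recruit(x, t, radius)  # add a GPFU where the fit fails
                net.train_step(x, t)

    print(f"{len(net.weights)} GPFUs recruited")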
