Abstract

This paper presents a versatile hyper-ellipsoidal basis function for function approximation in high-dimensional spaces. The hyper-ellipsoidal basis function can be translated and rotated to cover the data according to its distribution in the given high-dimensional space. Based on this function, we propose a one-pass hyper-ellipsoidal learning algorithm in which any new incoming datum can be fed for learning without revisiting previously learned data. This learning algorithm adjusts the parameters of the versatile hyper-ellipsoidal basis function. In addition, we propose the hyper-ellipsoidal basis function (HEBF) neural network, which uses the one-pass hyper-ellipsoidal learning algorithm. The structure of this network is similar to that of radial basis function (RBF) neural networks. The number of hidden neurons in the HEBF network can be increased or decreased during the learning process: it grows according to a geometric growth criterion and shrinks by merging two hidden neurons into a new one according to a merging criterion. The merging process can be done independently, without consulting the previously learned data set.

Keywords: Radial Basis Function, Hidden Neuron, Radial Basis Function Neural Network, High Dimensional Space, Initial Width
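The paper itself defines the exact parameterization; as a hedged illustration only, a basis function that can be translated (by a center vector) and rotated (by a rotation matrix) with per-axis widths can be sketched as below. The function name, the Gaussian form of the activation, and the 2-D example values are all assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def hyper_ellipsoidal_basis(x, center, widths, rotation):
    """Illustrative hyper-ellipsoidal activation (assumed Gaussian form).

    The offset x - center is rotated into the ellipsoid's principal axes,
    then each axis is scaled by its own width before the exponential, so the
    level sets are rotated, translated hyper-ellipsoids.
    """
    d = rotation @ (x - center)          # translate, then rotate into ellipsoid axes
    return float(np.exp(-np.sum((d / widths) ** 2)))

# 2-D example: a 45-degree rotation with unequal axis widths
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
c = np.array([1.0, 2.0])                 # center (translation)
w = np.array([2.0, 0.5])                 # per-axis widths

print(hyper_ellipsoidal_basis(c, c, w, R))  # peaks at 1.0 at the center
```

The activation is maximal at the center and decays anisotropically, which is what lets one such unit cover an elongated, tilted cluster that a spherical RBF unit would need several neurons to cover.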

