Abstract
This paper presents a new learning paradigm that combines Hebbian and anti-Hebbian learning. A layer of radial basis functions is adapted in an unsupervised fashion by minimizing a two-term cost function. The first term maximizes the output of each Gaussian neuron and can be seen as an implementation of the traditional Hebbian learning law. The second term reinforces competitive learning by penalizing the correlation between the nodes; it therefore has an “anti-Hebbian” effect that the Gaussian neurons learn without explicit lateral-inhibition synapses. As a result, decorrelated Hebbian learning (DHL) performs clustering in the input space while avoiding the “nonbiological” winner-take-all rule. In addition to the standard clustering problem, this paper presents an application of DHL to function approximation: a scaled piecewise-linear approximation of a function is obtained in a supervised fashion within the local regions of its domain determined by DHL. For comparison, a standard single-hidden-layer Gaussian network is optimized with initial centers corresponding to the DHL solution. The efficiency of the algorithm is demonstrated on the chaotic Mackey-Glass time series.
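To make the idea concrete, below is a minimal sketch of one plausible reading of the DHL update, not the paper's actual algorithm: the abstract does not give the exact cost function or hyperparameters, so the specific form J(x) = -Σᵢ gᵢ(x) + λ Σ_{i≠j} gᵢ(x) gⱼ(x), the Gaussian width `sigma`, the learning rate `eta`, the decorrelation weight `lam`, and the toy data are all illustrative assumptions. Gradient descent on this cost pulls each center toward the input in proportion to its own activation (Hebbian), damped when other units respond to the same input (anti-Hebbian decorrelation), with no winner-take-all selection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (not taken from the paper):
n_units, dim = 4, 2
sigma = 0.5   # fixed Gaussian (RBF) width
eta = 0.05    # learning rate
lam = 1.0     # weight of the decorrelation (anti-Hebbian) term

centers = rng.normal(size=(n_units, dim))

def activations(x, centers, sigma):
    """Gaussian outputs g_i(x) = exp(-||x - c_i||^2 / (2 sigma^2))."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def dhl_step(x, centers, sigma, eta, lam):
    """One online step of gradient descent on the assumed cost
       J(x) = -sum_i g_i(x) + lam * sum_{i != j} g_i(x) g_j(x),
    which yields the center update
       c_i += eta * g_i * (1 - lam * sum_{j != i} g_j) * (x - c_i) / sigma^2.
    Each unit is pulled toward x by its own activation (Hebbian term);
    the pull is weakened, or reversed, when other units also respond,
    decorrelating the nodes without lateral-inhibition synapses or a
    winner-take-all rule."""
    g = activations(x, centers, sigma)
    others = g.sum() - g                 # sum_{j != i} g_j for each unit i
    gain = g * (1.0 - lam * others)      # per-unit learning gain
    centers += eta * gain[:, None] * (x - centers) / sigma ** 2
    return centers

# Toy data: two well-separated clusters; the centers should settle
# on them without any explicit competition mechanism.
data = np.vstack([
    rng.normal([-1.0, -1.0], 0.2, size=(200, 2)),
    rng.normal([+1.0, +1.0], 0.2, size=(200, 2)),
])
for epoch in range(20):
    for x in rng.permutation(data):
        dhl_step(x, centers, sigma, eta, lam)
print(centers)
```

Under this reading, setting `lam = 0` recovers a purely Hebbian rule in which all centers drift toward the data mean; the decorrelation term is what spreads them across distinct clusters.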