Abstract

This paper proposes a framework for developing a broad variety of soft clustering and learning vector quantization (LVQ) algorithms based on gradient descent minimization of a reformulation function. According to the proposed axiomatic approach to learning vector quantization, the development of specific algorithms reduces to the selection of a generator function. A linear generator function leads to the fuzzy c-means (FCM) and fuzzy LVQ (FLVQ) algorithms, while an exponential generator function leads to entropy-constrained fuzzy clustering (ECFC) and entropy-constrained LVQ (ECLVQ) algorithms. The reformulation of clustering and LVQ algorithms is also extended to supervised learning models through an axiomatic approach proposed for reformulating radial basis function (RBF) neural networks. This approach results in a broad variety of admissible RBF models, with the form of the radial basis functions determined by a generator function. This paper shows that gradient descent learning makes reformulated RBF neural networks an attractive alternative to conventional feed-forward neural networks.
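To make the role of the generator function concrete, the following is a minimal NumPy sketch of the two cases named in the abstract, assuming squared Euclidean distances between data points and prototypes. The function names (memberships_linear, memberships_exponential, lvq_step) and the parameters m (fuzzifier), beta (softness), and lr (learning rate) are illustrative choices, not notation from the paper: a linear generator yields FCM-style memberships, an exponential generator yields entropy-constrained (softmax-like) memberships, and prototypes are updated by gradient descent.

```python
import numpy as np

def memberships_linear(D, m=2.0, eps=1e-12):
    """FCM-style memberships from a linear generator function.
    D: (n_samples, n_prototypes) squared distances; m > 1 is the fuzzifier."""
    W = (D + eps) ** (1.0 / (1.0 - m))      # d_ij raised to 1/(1-m)
    return W / W.sum(axis=1, keepdims=True)  # rows sum to 1

def memberships_exponential(D, beta=1.0):
    """Entropy-constrained (ECFC-style) memberships from an exponential
    generator function; beta controls the softness of the assignment."""
    W = np.exp(-D / beta)
    return W / W.sum(axis=1, keepdims=True)

def lvq_step(X, V, m=2.0, lr=0.1):
    """One gradient-descent step on the prototypes V (FLVQ-style sketch)."""
    # Squared Euclidean distances between every sample and every prototype
    D = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)
    U = memberships_linear(D, m)
    # Gradient of sum_ij u_ij^m * d_ij with respect to each prototype v_j
    G = ((U ** m)[:, :, None] * (V[None, :, :] - X[:, None, :])).sum(axis=0)
    return V - lr * 2.0 * G

# Illustrative usage: cluster 100 random 2-D points with 3 prototypes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
V = X[rng.choice(100, size=3, replace=False)]
for _ in range(50):
    V = lvq_step(X, V)
```

Swapping memberships_linear for memberships_exponential inside lvq_step gives the entropy-constrained (ECLVQ-style) variant; under this view the choice of generator function is the only design decision, which is the point of the axiomatic framework.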
