Abstract
This work explores annealed cooperative–competitive learning of multiple modules of Mahalanobis normalized radial basis functions (NRBF), with applications to nonlinear function approximation and chaotic differential function approximation. A multilayer neural network is extended to comprise multiple Mahalanobis-NRBF modules. Each module activates normalized outputs of radial basis functions, determining Mahalanobis radial distances from its own adaptable weight matrix. An essential cooperative scheme decomposes learning of the multi-module network into sub-tasks of learning individual modules. Adaptable network interconnections are asynchronously updated module-by-module through annealed cooperative–competitive learning for function approximation under a physical-like mean-field annealing process. Numerical simulations show outstanding performance of annealed cooperative–competitive learning of a multi-module Mahalanobis-NRBF network for nonlinear function approximation and long-term look-ahead prediction of chaotic time series.
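The forward pass of one such module can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the Gaussian kernel, and the parameterization of the metric as `M = W @ W.T` (to keep the Mahalanobis form positive semi-definite) are assumptions; the paper's adaptable weight matrix and annealing-based training are not reproduced here.

```python
import numpy as np

def mahalanobis_nrbf(x, centers, W):
    """Hypothetical forward pass of a single Mahalanobis-NRBF module.

    x       : (d,) input vector
    centers : (K, d) radial basis centers
    W       : (d, d) adaptable weight matrix; M = W @ W.T is the
              assumed positive semi-definite Mahalanobis metric
    Returns a (K,) vector of normalized activations summing to 1.
    """
    M = W @ W.T                      # metric induced by the module's weight matrix
    diffs = x - centers              # (K, d) offsets from each center
    # squared Mahalanobis radial distances r_k = (x - c_k)^T M (x - c_k)
    r = np.einsum('kd,de,ke->k', diffs, M, diffs)
    a = np.exp(-r)                   # radial basis activations
    return a / a.sum()               # normalization across the module's units
```

The normalization step is what distinguishes NRBF from plain RBF: the module's units compete for a fixed total activation, which is the competitive ingredient the learning scheme builds on.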