Abstract

Cheung and Xu (2001) presented a dual structural recurrent radial basis function (RBF) network that accounts for the different scales of the net's inputs and outputs. However, such a network implies that the underlying functional relationship between the net's inputs and outputs is linearly separable, which may not hold in practice. In this paper, we therefore propose a new recurrent RBF network. It takes the net's input and the past outputs as an augmented input, in analogy with Billings and Fung (1995), but introduces a scale tuner into the net's hidden layer to balance the different scales of inputs and outputs. The network adaptively learns the parameters of the hidden layer together with those of the output layer. We implement this network using a variant of the extended normalized RBF (Cheung and Xu, 2001), with its hidden units learned by the rival penalization controlled competitive learning algorithm. Experiments demonstrate the outstanding performance of the proposed network in recursive function estimation.
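To make the described architecture concrete, the sketch below implements a normalized Gaussian RBF network over an augmented input [x_t, y_{t-1}], with a per-dimension scale vector standing in for the hidden-layer scale tuner. It is a minimal illustration under stated assumptions, not the authors' implementation: the toy system, the randomly sampled centers (used here in place of the RPCCL-learned hidden units), the fixed kernel width, and the least-squares output weights are all choices of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_rbf(Z, centers, width, scales):
    """Normalized Gaussian RBF layer with a per-dimension scale tuner.

    `scales` weights each dimension of the augmented input so that the
    fed-back outputs and the external inputs contribute on comparable
    scales (the role played by the scale tuner in the hidden layer).
    """
    diff = (Z[:, None, :] - centers[None, :, :]) * scales[None, None, :]
    phi = np.exp(-np.sum(diff ** 2, axis=2) / (2.0 * width ** 2))
    return phi / (phi.sum(axis=1, keepdims=True) + 1e-12)

# Toy one-step system y_t = f(x_t, y_{t-1}) (an assumption of this sketch)
T = 400
x = rng.uniform(-1.0, 1.0, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + np.tanh(2.0 * x[t])

# Augmented input: current external input plus the previous output
Z = np.column_stack([x[1:], y[:-1]])
target = y[1:]

# Hidden units: the paper learns centers with RPCCL; as a stand-in we
# simply sample centers from the training data.
K = 20
centers = Z[rng.choice(len(Z), K, replace=False)]
width = 0.5
scales = np.array([1.0, 1.0 / (np.std(y) + 1e-12)])  # crude scale balancing

Phi = normalized_rbf(Z, centers, width, scales)
w, *_ = np.linalg.lstsq(Phi, target, rcond=None)      # output-layer weights

# Recursive estimation: feed the network's own past output back in
y_hat = np.zeros(T)
for t in range(1, T):
    z = np.array([[x[t], y_hat[t - 1]]])
    y_hat[t] = (normalized_rbf(z, centers, width, scales) @ w)[0]

print("recursive RMSE:", np.sqrt(np.mean((y_hat[1:] - y[1:]) ** 2)))
```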
