Abstract

CMAC (Cerebellar Model Articulation Controller) neural networks are capable of learning nonlinear functions extremely quickly due to the local nature of the weight updating. The rectangular shape of CMAC receptive field functions, however, produces discontinuous (staircase) function approximations without inherent analytical derivatives. The ability to learn both functions and function derivatives is important for the development of many on-line adaptive filter, estimation, and control algorithms. It is shown that use of B-Spline receptive field functions in conjunction with more general CMAC weight addressing schemes allows higher-order CMAC neural networks to be developed that can learn both functions and function derivatives. This also allows novel hierarchical and multi-layer CMAC network architectures to be constructed that can be trained using standard error back-propagation learning techniques.
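The paper's own formulation is not reproduced here, but the core idea — replacing CMAC's rectangular receptive fields with B-splines so that the network output has an analytic derivative — can be illustrated with a minimal 1-D sketch. Everything below (class name, tiling scheme, learning-rate normalization) is an assumption for illustration, not the authors' code: each of several offset tilings activates one cell, three overlapping quadratic B-spline pieces weight that cell's neighborhood, and an LMS-style local update trains only the active weights.

```python
import numpy as np

class BSplineCMAC:
    """Hypothetical 1-D CMAC sketch with quadratic B-spline receptive fields."""

    def __init__(self, n_tiles=20, n_layers=5, x_min=0.0, x_max=1.0, lr=0.5):
        self.n_tiles, self.n_layers, self.lr = n_tiles, n_layers, lr
        self.x_min = x_min
        self.width = (x_max - x_min) / n_tiles       # tile width of each layer
        self.w = np.zeros((n_layers, n_tiles + 3))   # weight table, edge-padded

    def _basis(self, x, layer):
        """Active cell index plus B-spline values and derivatives there."""
        offset = layer * self.width / self.n_layers  # each layer's tiling is shifted
        t = (x - self.x_min + offset) / self.width
        i = int(t)
        u = t - i                                    # local coordinate in [0, 1)
        # Three overlapping uniform quadratic B-spline pieces (partition of unity);
        # the resulting per-layer output is C1-continuous, unlike rectangular fields.
        b = np.array([0.5 * (1 - u) ** 2, 0.5 + u - u * u, 0.5 * u * u])
        db = np.array([u - 1.0, 1.0 - 2.0 * u, u]) / self.width
        return i, b, db

    def predict(self, x, deriv=False):
        """Evaluate the network; optionally also its analytic derivative."""
        y = dy = 0.0
        for k in range(self.n_layers):
            i, b, db = self._basis(x, k)
            y += self.w[k, i:i + 3] @ b
            dy += self.w[k, i:i + 3] @ db
        return (y, dy) if deriv else y

    def train(self, x, target):
        """Local LMS-style update: only the active weights change."""
        err = target - self.predict(x)
        for k in range(self.n_layers):
            i, b, _ = self._basis(x, k)
            self.w[k, i:i + 3] += self.lr * err * b / self.n_layers
        return err
```

Because only a handful of weights are touched per sample, training retains CMAC's fast local learning, while `predict(x, deriv=True)` returns a smooth derivative estimate — the property the abstract identifies as missing from rectangular receptive fields.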
