Abstract

The problem of training a neural network to approximate a multivariate function is addressed at length in the current literature. The lack of a non-iterative, closed-form solution for the weights of sigmoidal neural networks motivates an alternative view of function approximation. It has long been known that Cerebellar Model Articulation Controller (CMAC) networks can approximate multivariate functions and can be programmed directly from a training data set. CMAC networks, however, suffer from the curse of dimensionality: the number of receptive-field functions that must be employed grows exponentially with the dimension of the function's domain. This paper studies networks that limit the number of dimensions onto which a high-dimensional function must be projected for approximation. While preserving the direct programmability of CMAC networks, the approach uses frequency information about the function to limit the number of input couplings used in the approximation. It is shown that for bandlimited functions, the number of network parameters (weights) required to achieve zero approximation error grows only polynomially with dimension. The paper concludes with a procedure for estimating the size of a CMAC network from the frequency bandwidth of the function to be approximated.
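The receptive-field structure and direct, non-iterative programmability the abstract refers to can be illustrated with a minimal one-dimensional CMAC sketch. This is not the paper's construction; the class name, tiling counts, and the LMS-style update below are illustrative assumptions. Each input activates one cell in each of several offset tilings, and the output is the sum of the active cells' weights.

```python
import numpy as np

class CMAC1D:
    """Minimal 1-D CMAC sketch (illustrative, not the paper's construction).

    The input range is covered by `n_tilings` coarse tilings, each shifted
    by a fraction of a tile width. An input activates exactly one cell per
    tiling; the network output is the sum of the active weights.
    """

    def __init__(self, n_tilings=8, n_tiles=16, x_min=0.0, x_max=1.0):
        self.n_tilings = n_tilings
        self.n_tiles = n_tiles
        self.x_min, self.x_max = x_min, x_max
        # +1 column so the offset tilings can spill past the last tile.
        self.w = np.zeros((n_tilings, n_tiles + 1))

    def _active(self, x):
        # Normalized position measured in tile widths.
        u = (x - self.x_min) / (self.x_max - self.x_min) * self.n_tiles
        for t in range(self.n_tilings):
            offset = t / self.n_tilings  # fractional shift per tiling
            yield t, int(u + offset)

    def predict(self, x):
        return sum(self.w[t, i] for t, i in self._active(x))

    def train(self, x, y, lr=0.5):
        # LMS-style update: spread the error across the active cells.
        err = y - self.predict(x)
        for t, i in self._active(x):
            self.w[t, i] += lr * err / self.n_tilings

# Example: approximate one period of a sine wave.
net = CMAC1D()
xs = np.linspace(0.0, 1.0, 200)
for _ in range(50):
    for x in xs:
        net.train(x, np.sin(2 * np.pi * x))
```

The curse of dimensionality the abstract describes is visible in this structure: a d-dimensional version of the same grid needs on the order of `n_tiles ** d` cells per tiling, which is what motivates projecting onto a limited number of coupled dimensions.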
