Abstract

While learning an unknown input-output task, humans first strive to understand the qualitative structure of the function. Accuracy of performance is then improved with practice. In contrast, existing neural network function approximators do not have an explicit means for abstracting the qualitative structure of a target function. To fill this gap, we introduce the concept of function emulation, according to which the central goal of training is to “emulate” the qualitative structure of the target function. The framework of catastrophe or singularity theory is used to characterize the qualitative structure of a smooth function, which is organized by the critical points of the function. The proposed scheme of function emulation uses the radial basis function network to realize a modular architecture wherein each module emulates the target function in the neighborhood of a critical point. The network size required to emulate the target in the neighborhood of a critical point is shown to be related to a certain complexity measure of the critical point. For a large class of smooth functions, the present scheme produces a graph-like abstraction of the target, thereby providing a qualitative representation of a quantitative input-output relation. © 1997 Elsevier Science Ltd. All Rights Reserved.
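
To make the modular idea concrete, the following is a minimal sketch, not the authors' implementation: a single Gaussian radial basis function module fitted to a target function in the neighborhood of one of its critical points, with one module per critical point. The target f(x) = x^3 - 3x, the neighborhood radius, the number of centers, and the basis width are all illustrative assumptions rather than values from the paper.

```python
# Illustrative sketch: one Gaussian RBF module per critical point of a target.
# All numerical choices (radius, n_centers, width) are assumptions for the demo.
import numpy as np

def target(x):
    return x**3 - 3.0 * x          # smooth target with critical points at x = -1, +1

def fit_rbf_module(critical_point, radius=0.5, n_centers=7, width=0.2):
    """Fit one RBF module to the target on [c - radius, c + radius] by least squares."""
    x = np.linspace(critical_point - radius, critical_point + radius, 50)
    centers = np.linspace(critical_point - radius, critical_point + radius, n_centers)
    # Design matrix of Gaussian basis responses at the sample points.
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))
    weights, *_ = np.linalg.lstsq(phi, target(x), rcond=None)
    return centers, weights, width

def eval_module(x, centers, weights, width):
    """Evaluate one fitted module at points x."""
    phi = np.exp(-((np.atleast_1d(x)[:, None] - centers[None, :]) ** 2) / (2 * width**2))
    return phi @ weights

# One module per critical point; together the modules give a local, graph-like
# emulation of the target around the points that organize its qualitative structure.
for c in (-1.0, 1.0):
    centers, weights, width = fit_rbf_module(c)
    approx = eval_module(c, centers, weights, width)[0]
    print(f"critical point {c:+.1f}: target={target(c):+.3f}, module={approx:+.3f}")
```

In this toy setting each module is fitted independently on its own neighborhood; the paper's point that module size scales with the complexity of the critical point would correspond here to choosing n_centers per module rather than using a fixed value.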
