Abstract

In this article we give a general methodology to build and work with functional networks, a network-based alternative to the neural-network paradigm. In functional networks, neuron functions are allowed to be not only multivariate but also truly multiargument and different for each neuron. Thus, neuron functions instead of weights are learned. In addition, outputs coming from different neurons can be connected, that is, forced to output the same values. The topology and neuron functions of functional networks can be selected based on data, domain knowledge, or a combination of the two. Functional equations play an important role in functional networks, since the preceding types of connections lead to functional equations that impose a substantial reduction in the degrees of freedom of the initial neuron functions. Some methods are given to obtain equivalent functional and differential equations, and they are applied to approximate the solutions of differential-equation problems. The examples of an associative operator, a cantilever beam, and a mass supported by two springs and a viscous damper are given to illustrate the methods and show their power.
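The key idea that neuron functions, rather than weights, are the learned quantities can be sketched in a few lines. A common approach (a minimal illustration, not the paper's specific algorithm) is to expand each unknown neuron function in a fixed basis, which turns learning into a linear least-squares problem. The additive model below, its polynomial bases, and the synthetic data are all hypothetical choices made for illustration.

```python
import numpy as np

# Sketch: learn the neuron functions f and g of an additive functional
# network z ~ f(x) + g(y) by expanding each in a cubic polynomial basis,
# so training reduces to linear least squares. (Illustrative assumption;
# real topologies and bases come from data or domain knowledge.)

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 300)
y = rng.uniform(-1.0, 1.0, 300)
# Synthetic target with small noise: f(x) = x^2, g(y) = 0.5 * y^3.
z = x**2 + 0.5 * y**3 + 0.01 * rng.normal(size=300)

# Design matrix: one shared constant, then the bases of f and of g.
A = np.column_stack([np.ones_like(x), x, x**2, x**3, y, y**2, y**3])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

residual = np.sqrt(np.mean((A @ coef - z) ** 2))
print(f"RMS residual: {residual:.4f}")
```

Because the model is linear in the basis coefficients, the fit is exact up to the noise level, and the recovered coefficients of `x**2` and `y**3` approximate the true values 1 and 0.5.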
