Abstract

In this paper we prove that any affine function defined on a d-simplex in R^d can be uniformly approximated by a single-layer neural network with only two neurons, irrespective of d. The weights of this network are obtained in closed analytical form, without training. This fact yields a correspondence rule that transforms mathematical approximants based on piecewise affine functions into neural networks. We introduce such an approximant, adaptive splitting based on cubature (ASBC), for the efficient approximation of continuous functions. Combining ASBC with the above correspondence rule, we obtain a neural tree. Numerical experiments on learning the distance function from a variable point to a geometric body in two and three dimensions show fast learning and high accuracy compared with single-hidden-layer feedforward networks trained by a trust-region method based on the interior-reflective Newton algorithm.
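The two-neuron claim can be illustrated with a minimal sketch. Assuming ReLU activations (the paper's activation function and closed-form weights may differ), an affine function f(x) = w·x + b is recovered from two neurons via the identity t = ReLU(t) - ReLU(-t), with weights (w, b) and (-w, -b) written down directly, without training. The function f below and the dimension d = 3 are illustrative choices, not taken from the paper:

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

# Illustrative affine function on R^d (here d = 3): f(x) = w.x + b
rng = np.random.default_rng(0)
d = 3
w = rng.standard_normal(d)
b = 0.5

def f(x):
    return x @ w + b

def two_neuron_net(x):
    # Single hidden layer with two neurons and closed-form weights
    # (w, b) and (-w, -b); the identity t = relu(t) - relu(-t) makes
    # the representation exact for any affine f, for any d.
    return relu(x @ w + b) - relu(-(x @ w + b))

x = rng.standard_normal((10, d))
print(np.allclose(f(x), two_neuron_net(x)))
```

With ReLU the representation is exact everywhere, which is consistent with (and stronger on this toy case than) the uniform-approximation statement above.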
