Abstract

In this paper, two aspects of numerical dynamics are applied to the analysis of artificial neural networks (ANNs). It is shown that the topological conjugacy of gradient dynamical systems, together with the shadowing and inverse shadowing properties, has nontrivial implications for the analysis of a perceptron learning process. The main result is that, generically, any such process is stable under numerics and robust. Implementation aspects are discussed as well. The analysis is based on a theorem on the global topological conjugacy of cascades generated by a gradient flow on a compact manifold without boundary.
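To fix ideas, the setting can be sketched as follows (an illustrative formulation, not taken verbatim from the paper; the symbols $E$ for the error functional, $w$ for the weight vector, $h$ for the learning step, $\varphi_t$ for the flow map, and $\Psi_h$ for the numerical step are introduced here only for illustration). The learning process is viewed as a discretization of the gradient flow of the error functional,
\[
\dot{w} = -\nabla E(w), \qquad \varphi_t(w_0) \ \text{the time-$t$ map of this flow},
\]
with the corresponding cascade given by the Euler step with learning rate $h$,
\[
w_{n+1} = w_n - h\,\nabla E(w_n) =: \Psi_h(w_n).
\]
Global topological conjugacy of the two cascades then means that there exists a homeomorphism $\alpha$ with
\[
\alpha \circ \varphi_h = \Psi_h \circ \alpha,
\]
so that, up to a continuous change of coordinates, the numerically implemented learning process reproduces the qualitative dynamics of the exact gradient flow; the shadowing and inverse shadowing properties express, in addition, that pseudo-trajectories produced under rounding and perturbations stay close to true trajectories and vice versa.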
