Abstract

The article presents a study of solutions of a system of ODEs with a special nonlinear part that is a continuous analogue of an arbitrary recurrent neural network (a neural ODE). As the nonlinear part of this system of differential equations, we used sums of piecewise continuous functions in which each term is a power function; these serve as activation functions. The use of power activation functions (PAFs) in neural networks generalizes the well-known rectified linear units (ReLU). At present, ReLU are commonly used to increase the depth of trainable neural networks, so introducing PAFs into neural networks significantly extends the capabilities of ReLU. The purpose of introducing power activation functions is that they yield verifiable Lyapunov stability conditions for solutions of the system of differential equations simulating the corresponding dynamic processes. In turn, Lyapunov stability is one guarantee of the adequacy of the neural network model for the process under study. In addition, global stability (or at least boundedness) of the solutions of the continuous analogue implies that the learning process of the corresponding neural network will not diverge for any training sample.
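The abstract describes PAFs as piecewise power functions that reduce to ReLU in a special case. The paper's exact definition appears in its Section 3; as a minimal sketch, one common way to write such a one-sided power activation (an assumption here, not the paper's formula) is x^r for non-negative x and 0 otherwise, so that r = 1 recovers ReLU:

```python
import numpy as np

def paf(x, r=2.0):
    """One-sided power activation function (illustrative form, assumed).

    Returns x**r where x >= 0 and 0 elsewhere, for exponent r > 0.
    With r = 1 this reduces to the standard ReLU max(0, x).
    """
    return np.maximum(x, 0.0) ** r
```

For example, `paf(np.array([-1.0, 2.0]), r=2.0)` zeroes the negative entry and squares the positive one, while `r=1.0` behaves exactly like ReLU.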

Highlights

  • One of the fundamental goals of machine learning is modeling and understanding real-world phenomena from observations

  • A third problem arises: (iii) if a neural network models a certain dynamic process, how can one guarantee the stability or boundedness of solutions of the system of differential equations describing the continuous analogue of that neural network?

  • Section 3 introduces the concept of a piecewise continuous power activation function (PAF) and presents its main properties;


Summary

Introduction

One of the fundamental goals of machine learning is modeling and understanding real-world phenomena from observations. The key contributions of this article are as follows. Section 3 introduces the concept of a piecewise continuous power activation function (PAF) and presents its main properties. Conditions for the global stability of neural ODEs with PAFs, obtained with the well-known method of Lyapunov functions, are given in Sections 4 and 5; checking these conditions reduces to verifying the negative definiteness of two known matrices. These results will be used to construct new types of chaotic neural networks.
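The introduction treats the neural ODE as a continuous analogue of a recurrent network whose right-hand side contains the PAF nonlinearity. A minimal numerical sketch, assuming a generic continuous-RNN form dx/dt = -Ax + B·φ(x) (the matrices A, B, the one-sided power form of φ, and the explicit Euler integrator are all assumptions for illustration, not the paper's construction):

```python
import numpy as np

def paf(x, r=2.0):
    # One-sided power activation (assumed form); r = 1 gives ReLU.
    return np.maximum(x, 0.0) ** r

def neural_ode_step(x, A, B, dt=0.01, r=2.0):
    # One explicit Euler step of dx/dt = -A x + B * paf(x),
    # a generic continuous-RNN form used here only as a sketch.
    return x + dt * (-A @ x + B @ paf(x, r))

# Simulate a small 2-state system for a few steps.
A = np.eye(2)            # assumed decay matrix
B = 0.1 * np.eye(2)      # assumed connection weights
x = np.array([0.5, -0.2])
for _ in range(100):
    x = neural_ode_step(x, A, B)
```

With a stable choice of A and small B, trajectories remain bounded; the paper's Sections 4 and 5 give checkable matrix conditions under which such boundedness is guaranteed globally.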

The relationship between neural networks and differential equations
Mathematical preliminaries
Even and odd activation functions
First use of power functions in neural ODEs
Second use of power functions in neural ODEs
Conclusion
