Abstract

In this article, we explore the effect of memory terms in continuous-layer Deep Residual Networks by studying Neural ODEs (NODEs). We investigate two types of models. On the one hand, we consider Residual Neural Networks whose update depends on multiple previous layers, specifically Momentum ResNets. On the other hand, we analyse a Neural ODE with auxiliary states that play the role of memory states. We examine the interpolation and universal approximation properties of both architectures from a simultaneous control perspective. We also prove the ability of the second model to represent sophisticated maps, such as parametrizations of time-dependent functions. Numerical simulations complement our study.
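As a purely illustrative sketch (the notation below is an assumption based on standard formulations of these architectures, not reproduced from the paper), the two memory mechanisms can be pictured as follows: a Momentum ResNet replaces the first-order residual flow by a second-order (momentum) dynamic, while the second model couples the state with auxiliary memory variables.

\[
\text{Momentum ResNet (continuous limit):}\qquad
\varepsilon\,\ddot{x}(t) + \dot{x}(t) = f\big(x(t), \theta(t)\big),
\quad\text{i.e.}\quad
\begin{cases}
\dot{x}(t) = v(t),\\[2pt]
\varepsilon\,\dot{v}(t) = f\big(x(t), \theta(t)\big) - v(t),
\end{cases}
\]
\[
\text{NODE with memory states:}\qquad
\begin{cases}
\dot{x}(t) = f\big(x(t), m(t), \theta(t)\big),\\[2pt]
\dot{m}(t) = g\big(x(t), m(t), \theta(t)\big),
\end{cases}
\]

where $x$ denotes the state, $v$ a velocity (momentum) variable, $m$ the auxiliary memory states, $\theta$ the trainable parameters, and $\varepsilon > 0$ a momentum parameter; the symbols $f$, $g$, $m$ and $\varepsilon$ are illustrative placeholders rather than the paper's notation.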
