Abstract

This article introduces a network architecture, called dynoNet, utilizing linear dynamical operators as elementary building blocks. Owing to the dynamical nature of these blocks, dynoNet networks are tailored for sequence modeling and system identification purposes. The back-propagation behavior of the linear dynamical operator with respect to both its parameters and its input sequence is defined. This enables end-to-end training of structured networks containing linear dynamical operators and other differentiable units, exploiting existing deep learning software. Examples show the effectiveness of the proposed approach on well-known system identification benchmarks.
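To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' dynoNet library) of how a learnable linear dynamical operator can be written as an ordinary differentiable PyTorch module and composed with a static nonlinearity; the class names `LinearDynamicalOperator` and `DynoBlock`, and all hyperparameters, are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LinearDynamicalOperator(nn.Module):
    """Learnable SISO transfer function G(q) = B(q)/A(q), evaluated with an
    explicit recursion so autograd can back-propagate through time.
    Illustrative sketch only; not the dynoNet library implementation."""

    def __init__(self, n_b=3, n_a=2):
        super().__init__()
        self.b = nn.Parameter(0.01 * torch.randn(n_b))  # numerator coefficients b_0..b_{n_b-1}
        self.a = nn.Parameter(0.01 * torch.randn(n_a))  # denominator coefficients a_1..a_{n_a}

    def forward(self, u):  # u: (batch, seq_len)
        _, seq_len = u.shape
        y = []
        for t in range(seq_len):
            # y[t] = sum_k b_k * u[t-k] - sum_j a_j * y[t-j]
            acc = sum(self.b[k] * u[:, t - k]
                      for k in range(self.b.numel()) if t - k >= 0)
            acc = acc - sum(self.a[j] * y[t - 1 - j]
                            for j in range(self.a.numel()) if t - 1 - j >= 0)
            y.append(acc)
        return torch.stack(y, dim=1)

class DynoBlock(nn.Module):
    """Wiener-style sub-network: linear dynamics followed by a static nonlinearity."""
    def __init__(self):
        super().__init__()
        self.G = LinearDynamicalOperator(n_b=3, n_a=2)
        self.F = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, u):
        x = self.G(u)                                  # dynamic, linear part
        return self.F(x.unsqueeze(-1)).squeeze(-1)     # static, nonlinear part

# End-to-end training works with any standard optimizer:
model = DynoBlock()
u, y_meas = torch.randn(8, 200), torch.randn(8, 200)   # toy input/output sequences
loss = nn.functional.mse_loss(model(u), y_meas)
loss.backward()                                        # gradients reach b and a as well
```

The sample-by-sample loop is deliberately naive and slow; the point is only that, once the operator's back-propagation behavior is available, the whole structured network trains end-to-end like any other deep learning model.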
