Abstract
This letter presents a new deep learning-based framework for robust nonlinear estimation and control using the concept of a Neural Contraction Metric (NCM). The NCM uses a deep long short-term memory recurrent neural network for a global approximation of an optimal contraction metric, the existence of which is a necessary and sufficient condition for exponential stability of nonlinear systems. The optimality stems from the fact that the contraction metrics sampled offline are the solutions of a convex optimization problem to minimize an upper bound of the steady-state Euclidean distance between perturbed and unperturbed system trajectories. We demonstrate how to exploit NCMs to design an online optimal estimator and controller for nonlinear systems with bounded disturbances utilizing their duality. The performance of our framework is illustrated through Lorenz oscillator state estimation and spacecraft optimal motion planning problems.
Highlights
Provably stable and optimal state estimation and control algorithms for a class of nonlinear dynamical systems with external disturbances are essential to develop autonomous robotic explorers operating remotely on land, in water, and in deep space.
We sample contraction metrics by solving an optimization problem with exponential stability constraints, the objective of which is to minimize an upper bound of the steady-state Euclidean distance between perturbed and unperturbed system trajectories
We present a Neural Contraction Metric (NCM), a deep learning-based global approximation of an optimal contraction metric for online nonlinear estimation and control
Summary
Provably stable and optimal state estimation and control algorithms for a class of nonlinear dynamical systems with external disturbances are essential to develop autonomous robotic explorers operating remotely on land, in water, and in deep space. The convex optimization-based sampling methodology in our framework allows us to obtain a large enough dataset of the optimal contraction metric without assuming any hypothesis function space. These sampled metrics, the existence of which is a necessary and sufficient condition for exponential convergence [5], can be approximated with arbitrary accuracy due to the high representational power of the deep LSTM-RNN. We could alternatively rely on numerical schemes for sampling data points of a Lyapunov function, such as the state-dependent Riccati equation method [24], [25]. It is proposed in [9]–[11] that this framework can be improved to obtain an optimal contraction metric, which minimizes an upper bound of the steady-state mean squared tracking error for nonlinear stochastic systems. The NCM addresses this issue by approximating the sampled solutions with the LSTM-RNN.