Abstract

Neural networks (NNs) have emerged as a state-of-the-art method for modeling nonlinear systems in model predictive control (MPC). However, the robustness of NNs, i.e., their sensitivity to small input perturbations, remains a critical challenge for practical applications. To address this, we develop Lipschitz-Constrained Neural Networks (LCNNs) for modeling nonlinear systems and derive rigorous theoretical results to demonstrate their effectiveness in approximating Lipschitz functions, reducing input sensitivity, and preventing overfitting. Specifically, we first prove a universal approximation theorem showing that LCNNs using SpectralDense layers can approximate any 1-Lipschitz target function. Then, we prove a probabilistic generalization error bound for LCNNs using SpectralDense layers via their empirical Rademacher complexity. Finally, the LCNNs are incorporated into the MPC scheme, and a chemical process example is used to show that LCNN-based MPC outperforms MPC using conventional feedforward NNs in the presence of training data noise.
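The abstract's key architectural idea is that a network built from 1-Lipschitz dense layers and 1-Lipschitz activations is itself 1-Lipschitz, which bounds its sensitivity to input perturbations. As a rough illustration only, and not the paper's SpectralDense implementation, the sketch below enforces a 1-Lipschitz dense layer by rescaling the weight matrix by its spectral norm; the class name `LipschitzDense`, the layer sizes, and the use of ReLU are assumptions made for this example.

```python
import numpy as np

class LipschitzDense:
    """Illustrative dense layer whose weight matrix is rescaled so its
    spectral norm is at most 1, making the linear map 1-Lipschitz.
    (Hypothetical sketch; not the paper's SpectralDense layer.)"""

    def __init__(self, in_dim, out_dim, seed=None):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((out_dim, in_dim)) / np.sqrt(in_dim)
        self.b = np.zeros(out_dim)

    def __call__(self, x):
        # Largest singular value of W = Lipschitz constant of x -> W @ x.
        sigma = np.linalg.norm(self.W, ord=2)
        W_hat = self.W / max(sigma, 1e-12)   # enforce ||W_hat||_2 <= 1
        # ReLU is 1-Lipschitz, so the full layer remains 1-Lipschitz.
        return np.maximum(W_hat @ x + self.b, 0.0)

# Lipschitz constants multiply under composition, so stacking
# 1-Lipschitz layers keeps the whole network 1-Lipschitz.
layers = [LipschitzDense(4, 16, seed=0), LipschitzDense(16, 2, seed=1)]
x = np.array([0.1, -0.3, 0.5, 0.2])
y = x
for layer in layers:
    y = layer(y)
print(y)
```

In an LCNN-based MPC setting, such a constrained network would play the role of the nonlinear process model inside the controller's optimization problem; the 1-Lipschitz property then limits how much noise in the training data or in the measured state can distort the model's predictions.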
