Abstract
We present novel approximations of variational losses applicable to the training of physics-informed neural networks (PINNs). The formulations reflect classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations. The loss approximations rest on polynomial differentiation realised by an extension of classic Gauss–Legendre cubatures, which we term Sobolev cubatures, and serve as a replacement for automatic differentiation. We prove that the training time complexity of the resulting Sobolev-PINNs with polynomial differentiation is lower than that of PINNs relying on automatic differentiation. In addition to a one-to-two order of magnitude speed-up, the Sobolev-PINNs are demonstrated to achieve closer solution approximations than established PINNs for prominent forward and inverse, linear and non-linear PDE problems.
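To illustrate the high-level idea, the sketch below performs polynomial differentiation on Gauss–Legendre nodes via a Lagrange differentiation matrix in place of automatic differentiation, and uses the quadrature weights to approximate an integral-type residual loss. This is a minimal illustration of standard spectral techniques under our own assumptions, not the paper's Sobolev-cubature construction; all names in the code are illustrative.

```python
# Minimal sketch (assumptions ours, not the paper's implementation):
# differentiate on Gauss-Legendre nodes with a polynomial (Lagrange)
# differentiation matrix instead of automatic differentiation, and use
# the quadrature weights to approximate an integral-type loss.
import numpy as np

def gauss_legendre_diff_matrix(n):
    """Return nodes x, weights w, and a matrix D such that (D f)(x_i)
    equals p'(x_i), where p is the degree-(n-1) interpolant of f on x."""
    x, w = np.polynomial.legendre.leggauss(n)
    # Barycentric weights of the Lagrange basis on the nodes x.
    c = np.array([1.0 / np.prod(x[i] - np.delete(x, i)) for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (c[j] / c[i]) / (x[i] - x[j])
        D[i, i] = -D[i].sum()  # negative-sum trick: rows annihilate constants
    return x, w, D

# Differentiate u(x) = sin(pi x) on [-1, 1] without autodiff.
x, w, D = gauss_legendre_diff_matrix(20)
u = np.sin(np.pi * x)
du = D @ u                                            # polynomial differentiation
print(np.max(np.abs(du - np.pi * np.cos(np.pi * x))))  # spectrally small error

# Quadrature-weighted squared residual, e.g. for the ODE u' - pi*cos(pi x) = 0:
residual = du - np.pi * np.cos(np.pi * x)
loss = w @ residual**2   # approximates the L2 norm of the residual on [-1, 1]
print(loss)
```

Because D is a fixed matrix, every derivative evaluation inside a training loop reduces to a matrix product, which is consistent with the complexity advantage and speed-up the abstract reports.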