Abstract

Recent developments in numerous scientific computing fields have been made possible by physics-informed neural networks (PINNs), a class of machine learning techniques that incorporate physical knowledge into neural networks. However, PINNs still struggle with a class of partial differential equations (PDEs), namely the gradient flow equations of complex dynamical systems. Through an analysis from the neural tangent kernel (NTK) perspective, we find that training PINNs directly on a large temporal window may lead to failure. To tackle this problem, we exploit a key physical property of gradient flow equations, the energy dissipation law, and propose an adaptive energy-based sequential training method for PINNs. Specifically, the gradient of the free energy is used to determine adaptive temporal steps for the training window, which also improves training efficiency. Numerical experiments are conducted on gradient flow equations including the Cahn-Hilliard, heat conduction, and Allen-Cahn equations. Our method not only outperforms standard PINNs but also preserves the energy dissipation law, demonstrating its ability to solve long-time simulation problems for complex gradient flow systems.
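
The abstract describes choosing the next temporal training window from the decay rate of the free energy. The following is a minimal sketch of that idea under stated assumptions; the function name `next_window_length`, its parameters, and the clipping bounds are illustrative and not taken from the paper, which does not specify its exact scheme here.

```python
import numpy as np

def next_window_length(energy_history, window_length,
                       dt_min=1e-3, dt_max=1e-1, scale=1e-2):
    """Pick the length of the next training window for sequential PINN training.

    energy_history : free-energy values E(t) sampled (uniformly in time)
                     inside the window that was just trained.
    window_length  : temporal length of that window.

    The dissipation rate |dE/dt| is estimated by a finite difference over
    the window; a fast energy drop (rapid dynamics) yields a short next
    window, while slow dissipation yields a longer one.
    """
    e = np.asarray(energy_history, dtype=float)
    rate = abs(e[-1] - e[0]) / window_length   # approximate |dE/dt|
    next_len = scale / (rate + 1e-12)          # inverse scaling with the rate
    return float(np.clip(next_len, dt_min, dt_max))

# Example: a window of length 0.05 in which the energy dropped from 1.0 to 0.4
# gives a large dissipation rate, so the next window is shortened toward dt_min.
print(next_window_length([1.0, 0.7, 0.4], window_length=0.05))
```

In a sequential training loop, such a rule would be applied after each window converges: evaluate the free energy of the current PINN solution, estimate its decay rate, and advance the simulation by the returned window length before retraining on the next interval.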
