Abstract

As a typical application of deep learning, the physics-informed neural network (PINN) has been successfully used to find numerical solutions of partial differential equations (PDEs), but improving its limited accuracy remains a great challenge. In this work, we introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries or non-classical symmetries of PDEs are embedded into the loss function of PINN, to improve the accuracy of PINN for solving both the forward and inverse problems of PDEs. We test the effectiveness of SPINN on the forward problem via two groups of ten independent numerical experiments, using different numbers of collocation points and neurons per layer, for the Korteweg-de Vries (KdV) equation, the breaking soliton equation, the heat equation, and the potential Burgers equation, and on the inverse problem by considering different numbers of layers and neurons as well as different numbers of training points with different levels of noise for the Burgers equation in potential form. The numerical results show that SPINN performs better than PINN with fewer training points and a simpler neural-network architecture; in particular, it outperforms both the PINN method and the two-stage PINN method of Lin and Chen on the Sawada-Kotera equation. Furthermore, we discuss the computational overhead of SPINN in terms of its computational cost relative to PINN and show that the training time of SPINN does not increase noticeably, and is even lower than that of PINN in certain cases.
