Abstract

Physics-informed neural networks (PINNs) have recently become a research hotspot at the intersection of machine learning and computational mathematics, owing to their flexibility in tackling forward and inverse problems. In this work, we explore the generality of the PINN training algorithm for solving Hamilton-Jacobi equations and propose physics-informed neural networks with adaptively weighted loss functions (AW-PINN), trained on unsupervised learning tasks with fewer training data while physical constraints are imposed during training. To balance the contributions of the different constraints automatically, the AW-PINN training algorithm adaptively updates the weight coefficients of the loss terms using the logarithmic mean, avoiding additional hyperparameters. Moreover, the proposed AW-PINN algorithm imposes the periodicity requirement on both the boundary condition and its gradient. Fully connected feedforward neural networks are used, and optimization proceeds with the Adam optimizer for a number of steps followed by the L-BFGS-B optimizer. A series of numerical experiments illustrates that the proposed algorithm achieves noticeable improvements in predictive accuracy and in the convergence rate of the total training error, and can approximate the solution even when the Hamiltonian is nonconvex. A comparison between the proposed algorithm and the original PINN algorithm for Hamilton-Jacobi equations indicates that AW-PINN trains the solutions more accurately with fewer iterations.
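The logarithmic-mean weight update described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the function names and the particular target rule (pulling each weight toward the inverse relative magnitude of its loss term) are assumptions made for illustration; only the use of the logarithmic mean to smooth the update is taken from the abstract.

```python
import numpy as np

def log_mean(a, b, eps=1e-12):
    """Logarithmic mean L(a, b) = (a - b) / (log a - log b), with L(a, a) = a.

    L(a, b) always lies between the geometric and arithmetic means of a and b,
    which makes it a smooth, hyperparameter-free way to blend two positive values.
    """
    a = np.maximum(np.asarray(a, dtype=float), eps)
    b = np.maximum(np.asarray(b, dtype=float), eps)
    diff = np.log(a) - np.log(b)
    safe = np.where(np.abs(diff) < eps, 1.0, diff)  # avoid 0/0 when a == b
    return np.where(np.abs(diff) < eps, a, (a - b) / safe)

def update_weights(weights, losses):
    """One hypothetical adaptive step for the PINN loss weights.

    Each weight is pulled toward the inverse relative magnitude of its loss
    term (so dominant terms are damped and weak terms amplified), with the
    logarithmic mean blending the old weight and that target. The target rule
    is an assumption; the paper's precise update may differ.
    """
    losses = np.asarray(losses, dtype=float)
    target = losses.sum() / (len(losses) * np.maximum(losses, 1e-12))
    return log_mean(np.asarray(weights, dtype=float), target)
```

In a training loop, the detached values of the PDE-residual, initial-condition, and boundary-condition losses would be passed to `update_weights` every few iterations, and the returned weights used in the next weighted total loss.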
