Abstract

Physics-informed neural networks (PINNs) have been widely used in recent years to solve partial differential equations. However, studies have shown that PINNs suffer from a gradient pathology: the gradients of the individual loss terms become imbalanced during back-propagation, which makes it difficult for the network to accurately approximate the solution of the partial differential equation. Building on the deep weighted residual neural network and the neural attention mechanism, we propose a new mixed-weighted residual block in which the weighting coefficients are chosen autonomously by the optimization algorithm and one of the transformer networks is replaced by a skip connection. Finally, we test our algorithm on several partial differential equations, including the non-homogeneous Klein–Gordon equation, the (1+1) advection–diffusion equation, and the Helmholtz equation. Experimental results show that the proposed algorithm significantly improves numerical accuracy.
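The abstract only names the architecture, so the following is a minimal numpy sketch of what a "mixed-weighted residual block" of this kind might look like: a hidden layer whose output is blended with an attention-style encoder branch and a plain skip connection, with mixing coefficients (`alpha`, `beta` here) left as trainable parameters for the optimizer. The class name, the coefficient names, and the exact blending formula are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class MixedWeightedBlock:
    """Illustrative sketch (not the paper's code) of a residual block
    that mixes an attention-style branch with a skip connection using
    trainable weighting coefficients."""

    def __init__(self, dim, rng):
        # one dense layer inside the block
        self.W = rng.standard_normal((dim, dim)) / np.sqrt(dim)
        self.b = np.zeros(dim)
        # mixing coefficients; in training these would be optimized
        # jointly with the network weights (hypothetical names)
        self.alpha = 0.5
        self.beta = 0.5

    def __call__(self, h, u):
        # h: block input, u: output of a shared encoder/"transformer" branch
        z = np.tanh(h @ self.W + self.b)
        # weighted mix of the encoder-modulated branch and the plain
        # activation, plus a skip connection back to the block input
        return self.alpha * z * u + (1.0 - self.alpha) * z + self.beta * h
```

Stacking such blocks, with the scalar coefficients exposed to the optimizer, is one plausible reading of "weighting coefficients chosen autonomously by the optimization algorithm"; the paper's full text would determine the exact form.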
