Abstract

We propose LaNets, a new class of hybrid Lagrange neural networks for predicting numerical solutions of partial differential equations. Specifically, we embed Lagrange interpolation and small-sample learning into a deep neural network framework. First, Lagrange interpolation is performed in front of the deep feedforward neural network; the Lagrange basis functions have a compact structure and strong expressive power, which makes them well suited as a preprocessing tool for pre-fitting and feature extraction. Second, we introduce small-sample learning into training, which helps guide the model toward rapid correction. By combining the theoretical grounding of traditional numerical methods with the efficiency of modern machine learning, LaNets achieve higher predictive accuracy than state-of-the-art methods. The stability and accuracy of the proposed algorithm are demonstrated on a series of classical numerical examples, including the one-dimensional Burgers equation, one-dimensional carburizing diffusion equations, the two-dimensional Helmholtz equation, and the two-dimensional Burgers equation. The experimental results validate the robustness, effectiveness, and flexibility of the proposed algorithm.
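
To make the described architecture concrete, the sketch below shows one way the idea could be realized: the input coordinate is first mapped to Lagrange basis features at a fixed set of interpolation nodes, and those features are fed to a deep feedforward network. This is not the authors' implementation; the node placement, layer widths, use of PyTorch, and the names `lagrange_basis` and `LagrangeMLP` are illustrative assumptions.

```python
# Minimal sketch (assumed implementation, not the authors' code):
# Lagrange basis features computed from the input are passed to a
# deep feedforward network, mirroring the preprocessing step the
# abstract describes.
import torch
import torch.nn as nn


def lagrange_basis(x, nodes):
    """Evaluate all Lagrange basis polynomials L_j at the inputs x.

    x:     tensor of shape (batch, 1)
    nodes: tensor of shape (n,) containing distinct interpolation nodes
    returns a tensor of shape (batch, n), one column per basis function.
    """
    diff_x = x - nodes                        # (batch, n): x - x_m
    diff_nodes = nodes.unsqueeze(1) - nodes   # (n, n):     x_j - x_m
    basis = []
    for j in range(len(nodes)):
        mask = torch.arange(len(nodes)) != j
        num = torch.prod(diff_x[:, mask], dim=1)   # prod_{m != j} (x - x_m)
        den = torch.prod(diff_nodes[j, mask])      # prod_{m != j} (x_j - x_m)
        basis.append(num / den)
    return torch.stack(basis, dim=1)


class LagrangeMLP(nn.Module):
    """Deep feedforward network preceded by a Lagrange-basis feature map."""

    def __init__(self, nodes, hidden=32, depth=3):
        super().__init__()
        self.register_buffer("nodes", nodes)
        layers, width = [], len(nodes)
        for _ in range(depth):
            layers += [nn.Linear(width, hidden), nn.Tanh()]
            width = hidden
        layers.append(nn.Linear(width, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(lagrange_basis(x, self.nodes))


# Example: five equidistant interpolation nodes on [-1, 1].
model = LagrangeMLP(nodes=torch.linspace(-1.0, 1.0, 5))
u = model(torch.rand(16, 1) * 2 - 1)   # predicted solution values
```

The sketch only covers the forward pass; in a LaNets-style workflow the output would additionally be trained against PDE residual and data losses, with the small labeled sample used to correct the model during training.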
