Abstract

Physics-informed neural networks (PINNs) have shown great potential for solving computational physics problems with sparse, noisy, unstructured, and multi-fidelity data. However, training PINNs remains challenging, and PINNs are not robust for certain complex problems, such as sharp local gradients in broad computational domains. Transfer learning techniques can provide fast and accurate training for PINNs through intelligent initialization, but previous studies are much less effective for transfer learning cases with a large range of parameter variation and suffer from the same drawbacks. This manuscript develops the concept of the minimum energy path for PINNs and proposes an adaptive transfer learning method for PINNs (AtPINN). The partial differential equation (PDE) parameters are initialized at the source values and updated adaptively toward the target values during training, which guides the optimization of the PINN from the source task to the target task. This process is essentially performed along a designed low-loss path with no barriers in the energy landscape of the neural network; consequently, the stability of the training process is guaranteed. AtPINN is applied to transfer learning cases with a large range of parameter variation in five complex problems. The results demonstrate that AtPINN has promising potential for extending the applicability of PINNs. In addition, three transfer learning cases with different ranges of parameter variation are analyzed through visualization. The results also show that adaptive transfer learning can serve as a standalone optimization strategy for solving problems directly, without intelligent initialization.
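To make the adaptive parameter update concrete, the sketch below illustrates the general idea on a toy problem; it is not the authors' implementation. The ODE u'(x) + k u(x) = 0 with u(0) = 1, the source/target values of k, and the linear annealing schedule are all assumptions chosen for illustration: the PDE parameter k is moved gradually from the source value to the target value while the PINN is trained.

```python
# Minimal sketch (assumed toy setup, not the AtPINN reference code):
# anneal the PDE parameter k from a source value to a target value during
# training, so the optimizer follows a low-loss path between the two tasks.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

k_source, k_target = 1.0, 20.0       # easy source task -> stiff target task
epochs, anneal_epochs = 5000, 3000   # k reaches the target before training ends

x = torch.linspace(0.0, 1.0, 100).reshape(-1, 1).requires_grad_(True)
x0 = torch.zeros(1, 1)               # boundary point enforcing u(0) = 1

for epoch in range(epochs):
    # Adaptive parameter update: interpolate k from source to target.
    t = min(epoch / anneal_epochs, 1.0)
    k = (1.0 - t) * k_source + t * k_target

    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du + k * u            # PDE residual of u' + k u = 0
    loss = (residual ** 2).mean() + (net(x0) - 1.0).pow(2).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Here the linear schedule stands in for whatever adaptive update rule is used; the key point is that the parameter, and hence the loss landscape, changes gradually so the network is never asked to jump directly from the source solution to a distant target solution.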
