The Energy-Dissipative Evolutionary Deep Operator Neural Network is an operator-learning neural network. It is designed to compute numerical solutions for a class of partial differential equations rather than a single equation, such as partial differential equations with different parameters or different initial conditions. The network consists of two sub-networks, the Branch net and the Trunk net. For a target operator G, the Branch net encodes different input functions f sampled at a fixed set of sensors x_i, i = 1, 2, …, m, and the Trunk net evaluates the output function at any location. By minimizing the error between the predicted output q and the expected output G(f)(y) at a test point y, DeepONet produces a good approximation of the operator G. A key distinction of our methodology is that DeepONet is used to train the initial state, which acts as a multi-parametric operator. The parameters of the model are then evolved in time by the Scalar Auxiliary Variable (SAV) method, which yields an iterative scheme for their progression. The SAV approach is adopted to preserve essential physical properties of the PDEs, in particular the energy dissipation law: it introduces a modified energy and establishes an unconditional energy dissipation law at the discrete level. By treating the parameters of the well-trained DeepONet as a representation of the initial operator and evolving them through a dynamical system, our network can accurately predict the solution at any later time, even though only the initial state is used as training data. To validate the accuracy and efficiency of our neural network, we provide numerical simulations of several partial differential equations, including heat equations, parametric heat equations, Allen–Cahn equations, and a reaction–diffusion equation in three dimensions.
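The branch/trunk decomposition described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two-layer MLPs, the sensor count m, the latent dimension p, and the sample input f(x) = sin(πx) are all assumptions for demonstration. The key point is the standard DeepONet combination q = Σ_k b_k · t_k, where the Branch net maps the sensor values f(x_1), …, f(x_m) to coefficients b_k and the Trunk net maps a query location y to basis values t_k.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    # Random-initialized fully connected layers (a stand-in for trained weights).
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Plain MLP forward pass with tanh activations on hidden layers.
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 20, 10                                  # number of sensors, latent dimension (assumed)
branch = init_mlp([m, 40, p], rng)             # Branch net: sensor values -> p coefficients
trunk = init_mlp([1, 40, p], rng)              # Trunk net: location y -> p basis values

x_sensors = np.linspace(0.0, 1.0, m)
f_vals = np.sin(np.pi * x_sensors)[None, :]    # one input function f sampled at the sensors

y = np.array([[0.3]])                          # query location
b_k = forward(branch, f_vals)                  # branch coefficients, shape (1, p)
t_k = forward(trunk, y)                        # trunk basis values, shape (1, p)
q = np.sum(b_k * t_k, axis=-1)                 # DeepONet output q ~ G(f)(y), shape (1,)
```

Training would minimize the mean squared error between q and the reference values G(f)(y) over many input functions f and test points y; here the weights are random, so q is only structurally, not numerically, meaningful.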
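To illustrate the unconditional discrete energy dissipation that SAV provides, the following sketch applies a generic SAV scheme to a simple scalar gradient flow u' = -E'(u) with a double-well potential; this toy problem and the absence of a stiff linear part are assumptions, not the paper's parameter-evolution system. The auxiliary variable r = sqrt(E(u) + C0) replaces the energy in the scheme, and the coupled update can be solved in closed form, giving r^{n+1} = r^n / (1 + Δt·g²/2), so the modified energy r² is non-increasing for any step size Δt.

```python
import numpy as np

def energy(u):
    # Double-well potential E(u) = (u^2 - 1)^2 / 4 (Allen-Cahn type, assumed example).
    return 0.25 * (u**2 - 1.0)**2

def grad_energy(u):
    return (u**2 - 1.0) * u

C0 = 1.0                          # constant keeping E + C0 strictly positive
dt = 0.1
u = 0.5                           # initial state
r = np.sqrt(energy(u) + C0)      # scalar auxiliary variable r = sqrt(E + C0)

rs = [r]
for _ in range(50):
    g = grad_energy(u) / np.sqrt(energy(u) + C0)
    r = r / (1.0 + 0.5 * dt * g**2)   # closed-form solve of the coupled SAV update
    u = u - dt * r * g                # gradient-flow step scaled by the auxiliary variable
    rs.append(r)
```

Because the denominator 1 + Δt·g²/2 is at least 1, each r^{n+1} ≤ r^n regardless of Δt; this mirrors, in miniature, the unconditional energy dissipation law the abstract attributes to the SAV discretization.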