Abstract
Deep learning‐based methods for solving partial differential equations (PDEs) have become a research hotspot. Building on previous work that applies deep learning to PDEs, these methods avoid the need for meshing and linearization. However, without labeled data they struggle to solve complex turbulent systems effectively, and failures to converge and unstable solutions are frequently encountered. To address these difficulties, this paper presents an approximation‐correction model for solving the seepage equation with unsteady boundaries. The model consists of two neural networks: the first acts as an asymptotic block that estimates the progression of the solution from its asymptotic form, and the second fine‐tunes the errors left by the asymptotic block. The solution to the unsteady boundary problem is obtained by superimposing these progressive blocks. Numerical experiments consider both a constant‐flow scenario and a three‐stage flow scenario in reservoir exploitation. The results demonstrate the method's effectiveness against numerical reference solutions, and the error analysis shows that the method achieves higher solution accuracy than other baseline methods.
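The abstract only outlines the two-network architecture; the minimal sketch below illustrates one way such an approximation‐correction model could be assembled in PyTorch. All names (MLP, ApproximationCorrection, the network sizes, and the (x, t) inputs) are hypothetical assumptions for illustration, and the physics‐informed training loss on the seepage equation residual is not shown.

import torch
import torch.nn as nn

class MLP(nn.Module):
    """Simple fully connected network reused for both blocks (hypothetical sizes)."""
    def __init__(self, in_dim=2, hidden=32, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

class ApproximationCorrection(nn.Module):
    """Sketch of the two-network idea: an asymptotic block that gives a coarse
    estimate of the solution, plus a correction network that learns the
    remaining error; the prediction superimposes the two outputs."""
    def __init__(self):
        super().__init__()
        self.asymptotic = MLP()  # estimates the solution's asymptotic behavior
        self.correction = MLP()  # fine-tunes the asymptotic block's error

    def forward(self, xt):
        return self.asymptotic(xt) + self.correction(xt)

# Usage: evaluate at (x, t) collocation points; training would minimize the
# PDE residual of the seepage equation plus boundary terms (omitted here).
model = ApproximationCorrection()
xt = torch.rand(128, 2)  # 128 random space-time points in [0, 1]^2
u_pred = model(xt)
print(u_pred.shape)  # torch.Size([128, 1])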