Abstract

The first result of applying machine/deep learning techniques to the fluid closure problem is presented in this paper. As a start, three different types of neural networks [multilayer perceptron (MLP), convolutional neural network (CNN), and two-layer discrete Fourier transform (DFT) network] were constructed and trained to learn the well-known Hammett–Perkins Landau fluid closure in configuration space. We find that training a well-performing network requires a minimum size of the training data set; the MLP also requires a minimum number of neurons in the hidden layers equal to the degrees of freedom in Fourier space, even though the training data are supplied in configuration space. Of the three models, the DFT network performs best on clean data, most likely because a simple Fourier-space expression exists for the Hammett–Perkins closure, but it is the least robust to input noise. Overall, with appropriate tuning and optimization, all three neural networks are able to accurately predict the Hammett–Perkins closure and reproduce its intrinsic nonlocal feature, suggesting a promising path toward calculating more sophisticated closures with machine/deep learning techniques.
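For context, the Fourier-space form of the Hammett–Perkins closure referred to above relates the perturbed parallel heat flux to the perturbed temperature. The normalization written here follows the standard Landau-fluid literature and is a recollection of that result, not a quotation of this paper's equations:

\[
\tilde{q}_{k} = -\sqrt{2}\, n_{0}\, \chi_{1}\, v_{t}\, \frac{i k}{\lvert k \rvert}\, \tilde{T}_{k},
\qquad \chi_{1} = \frac{2}{\sqrt{\pi}} .
\]

In configuration space the factor \(i k / \lvert k \rvert\) becomes a nonlocal, Hilbert-transform-like convolution kernel, which is the intrinsic nonlocal feature the networks must reproduce.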
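As an illustration of the configuration-space training setup described above, the following is a minimal, hypothetical sketch (not the authors' code): it generates synthetic temperature perturbations, evaluates the heat flux through the Fourier expression of the closure, and fits a small fully connected network (MLP) on the resulting configuration-space pairs. Grid size, network width, data-set size, and optimizer settings are all illustrative assumptions.

```python
# Hypothetical sketch: learn a Hammett-Perkins-style closure with an MLP.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
N = 128                              # grid points in configuration space
k = np.fft.fftfreq(N, d=1.0 / N)     # integer wavenumbers on a periodic box

def hammett_perkins_q(T, n0=1.0, vt=1.0):
    """q_k = -sqrt(2) * n0 * chi1 * vt * (i k / |k|) * T_k, chi1 = 2/sqrt(pi)."""
    chi1 = 2.0 / np.sqrt(np.pi)
    Tk = np.fft.fft(T)
    with np.errstate(divide="ignore", invalid="ignore"):
        sgn = np.where(k != 0, k / np.abs(k), 0.0)
    qk = -np.sqrt(2.0) * n0 * chi1 * vt * 1j * sgn * Tk
    return np.fft.ifft(qk).real      # real for real T (Hermitian symmetry)

# Synthetic data set: random smooth temperature perturbations and their fluxes.
n_samples = 4096
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
T = np.zeros((n_samples, N))
for m in range(1, 9):                # a few low-order Fourier modes
    amp = rng.normal(size=(n_samples, 1))
    phase = rng.uniform(0, 2 * np.pi, size=(n_samples, 1))
    T += amp * np.cos(m * x[None, :] + phase)
Q = np.stack([hammett_perkins_q(t) for t in T])

T_t = torch.tensor(T, dtype=torch.float32)
Q_t = torch.tensor(Q, dtype=torch.float32)

# Small fully connected network mapping T(x) -> q(x) in configuration space.
model = nn.Sequential(nn.Linear(N, 256), nn.Tanh(), nn.Linear(256, N))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(T_t), Q_t)
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.3e}")
```

Consistent with the observation above about the MLP's required capacity, the hidden width here (256) is chosen to exceed the number of Fourier degrees of freedom on the grid (of order N = 128).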
