Abstract

Physics-informed neural networks learn not from labeled examples, but by enforcing physical laws at a limited set of sampling points. Fully connected deep neural networks, implemented in deep learning libraries such as TensorFlow, are typically used as physics-informed neural networks for solving partial differential equations. The popularity of these libraries owes much to their built-in automatic differentiation and modern learning algorithms. We propose using radial basis function networks as physics-informed neural networks; they are distinguished by a simple structure and the ability to adjust not only linear but also nonlinear parameters. The authors have developed a fast Levenberg-Marquardt algorithm for training radial basis function networks, together with extensions of the TensorFlow library that implement the Levenberg-Marquardt algorithm and radial basis function networks. Solutions of model problems demonstrate the advantages of using radial basis function networks implemented in TensorFlow as physics-informed neural networks.

Keywords: Physics-informed neural networks · Radial basis function networks · Fully connected deep neural networks · Partial differential equations · Neural network learning · Adam algorithm · Levenberg-Marquardt algorithm
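The abstract itself contains no code. As a purely illustrative sketch, not the authors' implementation, the following shows how a Gaussian RBF network might serve as a physics-informed model in TensorFlow for the 1-D Poisson problem u''(x) = -pi^2 sin(pi x) with u(0) = u(1) = 0. It is trained here with Adam for brevity; the network, problem, and every name below are assumptions made for illustration.

```python
import numpy as np
import tensorflow as tf

class RBFNet(tf.Module):
    """Gaussian RBF network: centers and widths are the nonlinear
    parameters, the output weights are the linear parameters."""
    def __init__(self, n_centers=20):
        self.centers = tf.Variable(
            np.linspace(0.0, 1.0, n_centers)[None, :], dtype=tf.float32)
        self.log_widths = tf.Variable(
            np.full((1, n_centers), np.log(0.1)), dtype=tf.float32)
        self.weights = tf.Variable(
            tf.random.normal([n_centers, 1], stddev=0.1))

    def __call__(self, x):
        # phi_j(x) = exp(-(x - c_j)^2 / (2 s_j^2))
        s = tf.exp(self.log_widths)
        phi = tf.exp(-tf.square(x - self.centers) / (2.0 * tf.square(s)))
        return phi @ self.weights

net = RBFNet()
x = tf.constant(np.linspace(0.0, 1.0, 64)[:, None], dtype=tf.float32)
xb = tf.constant([[0.0], [1.0]])  # boundary collocation points

def pinn_loss():
    # PDE residual at the sampling points, computed with automatic
    # differentiation -- no labeled solution values are used.
    with tf.GradientTape() as t2:
        t2.watch(x)
        with tf.GradientTape() as t1:
            t1.watch(x)
            u = net(x)
        du = t1.gradient(u, x)
    d2u = t2.gradient(du, x)
    f = -np.pi**2 * tf.sin(np.pi * x)  # right-hand side
    residual = tf.reduce_mean(tf.square(d2u - f))
    boundary = tf.reduce_mean(tf.square(net(xb)))  # u(0) = u(1) = 0
    return residual + boundary

opt = tf.keras.optimizers.Adam(1e-2)
for _ in range(2000):
    with tf.GradientTape() as tape:
        loss = pinn_loss()
    grads = tape.gradient(loss, net.trainable_variables)
    opt.apply_gradients(zip(grads, net.trainable_variables))
```

Note how the centers and widths (nonlinear parameters) are trained jointly with the output weights (linear parameters); this adjustability of both parameter groups is the property the abstract highlights for RBF networks.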

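The abstract does not detail the authors' fast Levenberg-Marquardt algorithm. As a reminder of the baseline technique it accelerates, a single generic Levenberg-Marquardt step on a flat parameter vector might look like the sketch below; this is an assumption-laden illustration of the standard method, not the paper's algorithm.

```python
import tensorflow as tf

def lm_step(residual_fn, params, lam):
    """One damped Gauss-Newton (Levenberg-Marquardt) step:
    solve (J^T J + lam I) dp = -J^T r, then update p <- p + dp.
    Sketch only -- residual_fn and params are hypothetical names."""
    with tf.GradientTape() as tape:
        tape.watch(params)
        r = residual_fn(params)        # residual vector, shape [m]
    J = tape.jacobian(r, params)       # Jacobian, shape [m, n]
    n = tf.shape(J)[1]
    A = tf.transpose(J) @ J + lam * tf.eye(n)
    g = tf.transpose(J) @ r[:, None]
    dp = tf.linalg.solve(A, -g)        # damped Gauss-Newton direction
    return params + dp[:, 0]
```

In practice the damping factor lam is decreased after a step that reduces the loss and increased otherwise, interpolating between Gauss-Newton and gradient descent; the small parameter count of an RBF network is what makes forming and solving this n-by-n system affordable.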