Abstract

Randomized Neural Networks (RNNs) are a class of neural networks in which the hidden-layer parameters are fixed to randomly assigned values and the output-layer parameters are obtained by solving a linear system through least squares. This improves efficiency without degrading the accuracy of the network. In this paper, we combine the idea of the Local RNN (LRNN) with the Discontinuous Galerkin (DG) approach for solving partial differential equations: RNNs approximate the solution on the subdomains, and the DG formulation glues them together. Taking the Poisson problem as a model, we propose three numerical schemes and provide a convergence analysis. We then extend these ideas to time-dependent problems: taking the heat equation as a model, we propose three space–time LRNN-DG formulations. Finally, we present numerical tests that evaluate the proposed methods against the finite element method and the conventional DG method. The LRNN-DG methods achieve higher accuracy with the same number of degrees of freedom and solve time-dependent problems more precisely and efficiently, indicating that this new approach has great potential for solving partial differential equations.
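The core mechanism described above, freezing randomly assigned hidden-layer parameters and solving only for the output-layer weights by linear least squares, can be sketched in a few lines. This is a minimal illustration on a one-dimensional function-fitting task, not the paper's PDE schemes; the network size, weight ranges, and target function are illustrative assumptions.

```python
# Sketch of a randomized neural network (RNN) fit: hidden weights and
# biases are randomly assigned and frozen; only the output-layer
# coefficients are computed, via linear least squares.
import numpy as np

rng = np.random.default_rng(0)

# Sample points and target values on [0, 1] (illustrative target)
x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2.0 * np.pi * x).ravel()

# Hidden layer: weights W and biases b are fixed random values
n_hidden = 50
W = rng.uniform(-5.0, 5.0, size=(1, n_hidden))
b = rng.uniform(-5.0, 5.0, size=n_hidden)

# Feature matrix of hidden-layer outputs (tanh activation)
H = np.tanh(x @ W + b)            # shape (200, n_hidden)

# Output-layer parameters c solve the linear system H c ≈ y in the
# least-squares sense -- the only "training" step
c, *_ = np.linalg.lstsq(H, y, rcond=None)

# The resulting network is u(x) = tanh(x W + b) c
err = np.max(np.abs(H @ c - y))
print(f"max training error: {err:.2e}")
```

In the LRNN-DG setting of the paper, one such network would be attached to each subdomain, with the DG formulation supplying the coupling conditions between neighboring subdomains; the least-squares solve then involves the PDE residual and interface terms rather than pointwise function values.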
