Abstract

The problem of learning partial differential equations (PDEs) from given data is investigated. Several algorithms have been developed for learning PDEs from accurate data sets, including sparse optimization to approximate the coefficients of candidate terms in a general PDE model. In this work, the study is extended to spatiotemporal data sets with various noise levels. We compare the performance of conventional and novel denoising methods. Different neural-network architectures are used to denoise the data and approximate derivatives. These methods are tested numerically on the linear convection-diffusion equation and on the nonlinear convection-diffusion equation (Burgers' equation). The results suggest that changing the number of hidden units and hidden layers in the network architecture significantly affects the accuracy of the approximations. This is a further improvement over previously known denoising methods based on finite differences, polynomial regression splines, and single-layer neural networks.
