Abstract

This study combines a machine learning technique with the Hausdorff derivative to solve one-dimensional Hausdorff derivative diffusion equations. In the proposed artificial neural network method, a multilayer feed-forward neural network is adopted and improved by applying the Hausdorff derivative to the activation function of the hidden layers. The trial solution, which approximates the analytical solution, combines terms satisfying the boundary and initial conditions with the network output. To transform the original Hausdorff derivative equation into a minimization problem, an error function is defined, and the network coefficients are updated with the gradient descent algorithm during back-propagation. Two numerical examples are given to illustrate the accuracy and robustness of the proposed method. The results show that the improved machine learning technique is efficient for solving Hausdorff derivative diffusion equations in terms of both computational accuracy and stability.
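As a rough illustration of the trial-solution approach described above, the sketch below sets up a one-dimensional Hausdorff derivative diffusion equation, a trial solution that enforces the initial and boundary conditions exactly, and a residual-based error function minimized by plain gradient descent. All concrete choices are illustrative assumptions rather than the authors' implementation: the model problem ∂u/∂t^α = D ∂²u/∂x² with homogeneous boundary conditions, the fractal order α, the initial condition, the network size, and the use of standard tanh hidden activations in place of the Hausdorff-modified activation proposed in the paper.

```python
# Minimal sketch (not the authors' code) of a trial-solution neural network for a
# 1-D Hausdorff derivative diffusion equation. Assumed model problem:
#   d u / d t^alpha = D * d^2 u / d x^2   on (0,1) x (0,T],
#   u(x,0) = sin(pi*x),  u(0,t) = u(1,t) = 0.
# alpha, D, the initial condition, and the network size are illustrative assumptions.
import jax
import jax.numpy as jnp

alpha, D = 0.8, 1.0                       # assumed fractal order and diffusivity
f = lambda x: jnp.sin(jnp.pi * x)         # assumed initial condition

def init_params(key, sizes=(2, 16, 16, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def net(params, x, t):
    h = jnp.stack([x, t])
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)           # plain tanh here, not the paper's modified activation
    W, b = params[-1]
    return (h @ W + b)[0]

def trial(params, x, t):
    # Trial solution: initial-condition term plus a factor that vanishes at
    # t = 0, x = 0, and x = 1, so the constraints are satisfied exactly.
    return f(x) + x * (1.0 - x) * t * net(params, x, t)

def residual(params, x, t):
    # Hausdorff time derivative written via the chain rule (assumed form):
    # d u / d t^alpha = t^(1 - alpha) / alpha * d u / d t.
    u_t = jax.grad(trial, argnums=2)(params, x, t)
    u_xx = jax.grad(jax.grad(trial, argnums=1), argnums=1)(params, x, t)
    return t**(1.0 - alpha) / alpha * u_t - D * u_xx

def loss(params, xs, ts):
    # Error function: mean squared PDE residual over collocation points.
    r = jax.vmap(lambda x, t: residual(params, x, t))(xs, ts)
    return jnp.mean(r**2)

key = jax.random.PRNGKey(0)
params = init_params(key)
kx, kt = jax.random.split(key)
xs = jax.random.uniform(kx, (256,))
ts = jax.random.uniform(kt, (256,), minval=1e-3, maxval=1.0)

lr = 1e-3
grad_loss = jax.jit(jax.grad(loss))
for step in range(2000):                  # plain gradient descent, as in the abstract
    grads = grad_loss(params, xs, ts)
    params = [(W - lr * gW, b - lr * gb)
              for (W, b), (gW, gb) in zip(params, grads)]
```

After training, evaluating `trial(params, x, t)` at any point of the domain gives the approximate solution; the multiplicative construction of the trial solution is what removes the boundary and initial conditions from the loss, leaving only the residual term to minimize.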
