Abstract

We consider a numerical scheme for the approximation of a system that couples the evolution of a two-dimensional hypersurface to a reaction–diffusion equation on the surface. The surfaces are assumed to be graphs and evolve according to forced mean curvature flow. The method uses continuous, piecewise linear finite elements in space and a backward Euler scheme in time. Assuming the existence of a smooth solution, we prove optimal error bounds in both $L^\infty(L^2)$ and $L^2(H^1)$. We present several numerical experiments that confirm our theoretical findings and apply the method to simulate diffusion-induced grain boundary motion.
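Although the abstract does not state the model explicitly, a representative graph formulation of such a coupled system reads as follows. This is a sketch under assumed sign conventions: $u(\cdot,t)$ is the height function over a fixed domain $\Omega \subset \mathbb{R}^2$, $Q$ the area element, $w$ the surface species, $v$ the surface velocity, $\partial^\bullet$ the material derivative, and $f$, $g$ generic coupling and reaction terms; none of these symbols are taken from the paper itself:
\begin{align*}
  Q &= \sqrt{1 + |\nabla u|^2}, \qquad \Gamma(t) = \operatorname{graph} u(\cdot,t),\\
  \frac{u_t}{Q} &= \nabla \cdot \Bigl( \frac{\nabla u}{Q} \Bigr) + f(w) && \text{in } \Omega \times (0,T],\\
  \partial^\bullet w + w\, \nabla_{\Gamma(t)} \cdot v - \Delta_{\Gamma(t)} w &= g(w) && \text{on } \Gamma(t).
\end{align*}
Correspondingly, a backward Euler, piecewise linear finite element step for the graph equation might take the weak form below, where $V_h$ denotes the space of continuous, piecewise linear functions on a triangulation of $\Omega$ and $\tau$ the time step; the explicit (lagged) treatment of $Q$ and of $w_h^m$ is an assumed, commonly used linearisation, not necessarily the scheme analysed in the paper:
\begin{align*}
  \int_\Omega \frac{u_h^{m+1} - u_h^m}{\tau}\,\frac{\varphi_h}{Q(u_h^m)}
  + \int_\Omega \frac{\nabla u_h^{m+1} \cdot \nabla \varphi_h}{Q(u_h^m)}
  = \int_\Omega f(w_h^m)\, \varphi_h
  \qquad \forall\, \varphi_h \in V_h.
\end{align*}
Each such step requires only the solution of one linear system, which is consistent with the first-order-in-time accuracy expected from a backward Euler discretisation.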
