This paper provides a rigorous convergence rate and complexity analysis for a recently introduced framework, called PDE acceleration, for solving problems in the calculus of variations and explores applications to obstacle problems. PDE acceleration grew out of a variational interpretation of momentum methods, such as Nesterov's accelerated gradient method and Polyak's heavy ball method, that views acceleration methods as equations of motion for a generalized Lagrangian action. Its application to convex variational problems yields equations of motion in the form of a damped nonlinear wave equation rather than the nonlinear diffusion that arises from gradient descent. These accelerated PDEs can be efficiently solved with simple explicit finite difference schemes, where acceleration is realized by an improvement in the CFL condition from $\mathrm{d}t\sim \mathrm{d}x^2$ for diffusion equations to $\mathrm{d}t\sim \mathrm{d}x$ for wave equations. In this paper, we prove a linear convergence rate for PDE acceleration for strongly convex problems, provide a complexity analysis of the discrete scheme, and show how to optimally select the damping parameter for linear problems. We then apply PDE acceleration to solve minimal surface obstacle problems, including double obstacles with forcing, and stochastic homogenization problems with obstacles, obtaining state-of-the-art computational results.
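To make the CFL comparison concrete, the following minimal NumPy sketch (our illustration, not code from the paper) applies PDE acceleration to the Dirichlet energy $E[u]=\frac12\int|\nabla u|^2\,\mathrm{d}x$ on the unit square, whose accelerated flow is the damped wave equation $u_{tt}+a\,u_t=\Delta u$. An explicit central-in-time scheme is advanced with a wave-type time step $\mathrm{d}t\sim\mathrm{d}x$; the damping value, grid size, and iteration count are hypothetical choices for illustration only.

```python
import numpy as np

# Illustrative sketch: PDE acceleration for the Dirichlet energy.
# Explicit central-difference scheme for u_tt + a*u_t = Laplacian(u),
# stable under a wave-type CFL condition dt ~ dx (here dt = dx/2),
# in contrast to dt ~ dx^2 for the gradient-descent heat flow.

n = 101                       # grid points per side (hypothetical choice)
dx = 1.0 / (n - 1)
dt = 0.5 * dx                 # wave-type CFL time step, dt ~ dx
a = 2.0 * np.pi               # damping parameter (hypothetical choice)

x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")

u = np.sin(np.pi * X) * np.sin(np.pi * Y)        # initial guess
u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0    # Dirichlet boundary data
u_prev = u.copy()                                # zero initial velocity

for it in range(5000):
    # Standard 5-point Laplacian on interior nodes
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4.0 * u[1:-1, 1:-1]) / dx**2
    # Central differences in time for u_tt and u_t, solved for u^{n+1}
    u_next = (2.0 * u - (1.0 - 0.5 * a * dt) * u_prev + dt**2 * lap) \
             / (1.0 + 0.5 * a * dt)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0
    u_prev, u = u, u_next

# With zero boundary data the minimizer is u = 0, so the iterate should decay.
print("max |u| after relaxation:", np.abs(u).max())
```

With zero Dirichlet data the energy minimizer is the harmonic (identically zero) function, so the printed residual decaying toward zero indicates convergence of the accelerated flow; obstacle constraints and nonlinear (e.g., minimal surface) energies, as treated in the paper, would modify the update but keep the same explicit wave-equation structure.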