We are interested in restoring images with values in a symmetric Hadamard manifold by minimizing a functional with a quadratic data term and a total variation--like regularizing term. To solve the convex minimization problem, we extend the Douglas--Rachford algorithm and its parallel version to symmetric Hadamard manifolds. The core of the Douglas--Rachford algorithm consists of reflections at the proximal mappings of the functions involved in the functional to be minimized. In the Euclidean setting, the reflections of proper convex lower semicontinuous functions are nonexpansive. As a consequence, convergence results for Krasnoselski--Mann iterations imply the convergence of the Douglas--Rachford algorithm. Unfortunately, these general results do not carry over to Hadamard manifolds, where proper convex lower semicontinuous functions can have expansive reflections. However, by splitting our restoration functional in an appropriate way, we only have to deal with special functions, namely several distance-like functions and the indicator function of a special convex set. We prove that the reflections of certain distance-like functions on Hadamard manifolds are nonexpansive, which is a result of independent interest. Furthermore, the reflection of the indicator function involved is nonexpansive on Hadamard manifolds of constant curvature, so that the Douglas--Rachford algorithm converges in this setting. Several numerical examples demonstrate the advantageous performance of the suggested algorithm compared with existing methods such as the cyclic proximal point algorithm and half-quadratic minimization. Numerical convergence is also observed in our experiments on the Hadamard manifold of symmetric positive definite matrices with the affine invariant metric, which does not have constant curvature.
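For orientation, the following is a minimal sketch of the reflection-based Douglas--Rachford iteration in the Euclidean setting, under standard assumptions ($f$, $g$ proper convex lower semicontinuous, a minimizer of $f+g$ exists); the notation is generic and not taken verbatim from the paper. The iteration is built from proximal mappings and their induced reflections,
\[
\operatorname{prox}_{\lambda f}(x) = \operatorname*{arg\,min}_{y} \Bigl\{ \lambda f(y) + \tfrac{1}{2}\lVert x - y\rVert^2 \Bigr\},
\qquad
R_{\lambda f} := 2\operatorname{prox}_{\lambda f} - \operatorname{Id},
\]
and reads, with relaxation parameter $\alpha \in (0,1)$,
\[
t^{(k+1)} = (1-\alpha)\, t^{(k)} + \alpha\, R_{\lambda g}\bigl(R_{\lambda f}(t^{(k)})\bigr),
\qquad
x^{(k)} = \operatorname{prox}_{\lambda f}\bigl(t^{(k)}\bigr).
\]
Since both reflections are nonexpansive in the Euclidean case, the Krasnoselski--Mann theory yields convergence of $t^{(k)}$ to a fixed point whose proximal image is a minimizer. On a Hadamard manifold, the squared norm in the proximal mapping is replaced by the squared geodesic distance $d^2(x,y)$, and $R_{\lambda f}(x)$ reflects $x$ at $p = \operatorname{prox}_{\lambda f}(x)$ along the connecting geodesic, $R_{\lambda f}(x) = \exp_p(-\log_p x)$; the nonexpansiveness of these geodesic reflections is precisely what may fail in general and is established here for the special functions arising from the splitting.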