Abstract
Variance parameter estimation in linear mixed models is a challenge for many classical nonlinear optimization algorithms due to the positive-definiteness constraint on the random-effects covariance matrix. Many existing algorithms can get stuck on the boundary of the feasible region. In this paper, we address this problem by exploiting the intrinsic geometry of the parameter space to pull iterates away from the boundary. In this novel approach, we formulate residual maximum likelihood estimation as an optimization problem on a Riemannian manifold. Based on this formulation, we provide geometric higher-order information on the problem via the Riemannian gradient and the Riemannian Hessian, making the algorithms more robust against singularities. We test our approach numerically with Riemannian optimization algorithms. It yields variance parameter estimates of considerably higher quality than those of existing approaches.
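To illustrate the core idea, the following is a minimal sketch (not the paper's algorithm) of Riemannian gradient descent on the manifold of symmetric positive-definite (SPD) matrices under the affine-invariant metric. The objective here is a simple Gaussian negative log-likelihood with a hypothetical target matrix `S` standing in for the data; the point is that every iterate remains positive definite by construction, so the method never touches the boundary of the feasible region.

```python
# Illustrative sketch, assuming the affine-invariant metric on the SPD
# manifold; the objective f(X) = trace(S X^{-1}) + log det(X) is a toy
# Gaussian negative log-likelihood whose minimizer is X = S.
import numpy as np

def _sym_fun(X, fun):
    """Apply a scalar function to a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return (V * fun(w)) @ V.T

def spd_exp(X, xi):
    """Exponential map at X on the SPD manifold (affine-invariant metric)."""
    s = _sym_fun(X, np.sqrt)                    # X^{1/2}
    si = _sym_fun(X, lambda w: 1.0 / np.sqrt(w))  # X^{-1/2}
    return s @ _sym_fun(si @ xi @ si, np.exp) @ s

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
S = A @ A.T + np.eye(4)   # hypothetical SPD target (plays the role of data)

X = np.eye(4)             # start strictly inside the SPD cone
for _ in range(300):
    # Euclidean gradient of f is X^{-1} - X^{-1} S X^{-1}; under the
    # affine-invariant metric the Riemannian gradient is X (egrad) X = X - S.
    rgrad = X - S
    X = spd_exp(X, -0.2 * rgrad)  # retract along the negative gradient

print(np.linalg.norm(X - S))      # small: converged while staying SPD
```

Because the update uses the manifold's exponential map rather than a Euclidean step, no projection or eigenvalue clipping is needed to keep the covariance estimate positive definite; this is the mechanism the abstract refers to as exploiting the intrinsic geometry of the parameter space.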