Abstract

This article proposes a general Gaussian estimation approach for situations where maximum likelihood estimation is difficult to implement. The primary task is to construct a Gaussian estimation function by substituting exact expressions for the mean vector and the variance–covariance matrix of the response vector into the log-likelihood function of the multivariate normal distribution. A Gaussian estimator is then obtained by maximizing the Gaussian estimation function. This construction induces an optimality condition: the true parameter vector is the unique maximizer of the expected value of the Gaussian estimation function. This condition plays the same role as the one given by the Kullback–Leibler information number in the maximum likelihood approach, and it is central to establishing desirable theoretical properties such as consistency and asymptotic normality. The general Gaussian estimation approach can significantly reduce the computational burden when the log-likelihood function of a statistical model contains intractable high-dimensional integrals. Applying it to the Poisson-lognormal model, a special case of generalized linear mixed-effects models, yields a closed-form (i.e., free of intractable integrals) estimation approach for the fixed-effect parameters and variance components. A simulation study shows that the resulting estimator is precise, reliable, and computationally efficient.
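To make the construction concrete, the sketch below illustrates the idea for a Poisson-lognormal model with independent clusters: the exact marginal mean vector and variance–covariance matrix are plugged into the multivariate normal log-likelihood, which is then maximized numerically. This is a minimal sketch under assumed notation; the function names (`marginal_moments`, `gaussian_objective`, `fit_gaussian_estimator`), the random-intercept design, and the simulated data are illustrative and are not taken from the article.

```python
# Gaussian estimation sketch for a Poisson-lognormal random-intercept model.
# Model: y_ij | b_i ~ Poisson(exp(x_ij' beta + b_i)),  b_i ~ N(0, sigma2).
import numpy as np
from scipy.optimize import minimize


def marginal_moments(X, beta, sigma2, clusters):
    """Exact marginal mean vector and covariance matrix of the responses.

    Marginal mean:      m_ij = exp(x_ij' beta + sigma2 / 2)
    Marginal variance:  m_ij + m_ij^2 * (exp(sigma2) - 1)
    Within-cluster cov: m_ij * m_ik * (exp(sigma2) - 1)
    """
    m = np.exp(X @ beta + sigma2 / 2.0)
    same_cluster = (clusters[:, None] == clusters[None, :]).astype(float)
    V = np.outer(m, m) * (np.exp(sigma2) - 1.0) * same_cluster
    V[np.diag_indices_from(V)] += m  # add the conditional Poisson variance
    return m, V


def gaussian_objective(theta, y, X, clusters):
    """Negative Gaussian estimation function: the multivariate normal
    log-likelihood evaluated at the exact first two moments of y."""
    beta, sigma2 = theta[:-1], np.exp(theta[-1])  # log-parameterize sigma2 > 0
    m, V = marginal_moments(X, beta, sigma2, clusters)
    _, logdet = np.linalg.slogdet(V)
    resid = y - m
    return 0.5 * (logdet + resid @ np.linalg.solve(V, resid))


def fit_gaussian_estimator(y, X, clusters):
    """Maximize the Gaussian estimation function over (beta, log sigma2)."""
    theta0 = np.zeros(X.shape[1] + 1)
    res = minimize(gaussian_objective, theta0, args=(y, X, clusters),
                   method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])  # beta_hat, sigma2_hat


if __name__ == "__main__":
    # Simulated example (hypothetical design, not the article's study).
    rng = np.random.default_rng(0)
    n_clusters, n_per = 100, 4
    clusters = np.repeat(np.arange(n_clusters), n_per)
    X = np.column_stack([np.ones(n_clusters * n_per),
                         rng.normal(size=n_clusters * n_per)])
    beta_true, sigma2_true = np.array([0.5, 0.3]), 0.25
    b = rng.normal(0.0, np.sqrt(sigma2_true), n_clusters)[clusters]
    y = rng.poisson(np.exp(X @ beta_true + b))
    beta_hat, sigma2_hat = fit_gaussian_estimator(y, X, clusters)
    print("beta_hat:", beta_hat, "sigma2_hat:", sigma2_hat)
```

Because the marginal moments of the Poisson-lognormal model are available in closed form, no numerical integration over the random effects is needed; each objective evaluation only requires forming and solving with the marginal covariance matrix, which is the computational saving the abstract refers to.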
