Abstract

It is demonstrated that the sampling distributions of the maximum likelihood (ML) estimator and its Studentized statistic for the generalized Gaussian distribution do not pass the most powerful normality tests even for fairly large sample sizes. This disagreement with the predictions of standard large-sample ML theory, together with the computational burden of evaluating the associated polygamma functions, motivates the consideration of a competing convexity-based estimator. The asymptotic normality of this estimator is derived. It is shown that the competing estimator is almost as efficient as the ML estimator, and that its asymptotic relative efficiency with respect to the ML estimator tends to 1 as the shape parameter approaches zero. More importantly, its asymptotic distribution admits an exact variance-stabilizing transformation, whereas the asymptotic variance function of the ML estimator admits no closed-form variance-stabilizing transformation. The exact transformation is a composition of the inverse hyperbolic cotangent and square root functions. Besides stabilizing the variance, this transformation is remarkably effective at symmetrizing and normalizing the sampling distribution of the estimator, and hence at improving the standard normal approximation. Furthermore, this simple transformation provides a quite accurate approximation to the non-closed-form variance-stabilizing transformation of the ML estimator.
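For concreteness, the stated composition can be written out explicitly. The ordering below, with the inverse hyperbolic cotangent applied after the square root, is one natural reading of the abstract (which does not spell out the order), and is a sketch rather than the paper's stated formula:

\[
  g(x) \;=\; \operatorname{arccoth}\!\bigl(\sqrt{x}\bigr)
       \;=\; \frac{1}{2}\,\log\frac{\sqrt{x}+1}{\sqrt{x}-1},
  \qquad x > 1,
\]

where the restriction \(x > 1\) is what keeps the transform real-valued, since \(\operatorname{arccoth}(y)\) is real only for \(|y| > 1\).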
