ABSTRACT
Several methods for the robust estimation of the variance of a normal random variable are based on trimming. For example, a popular estimator trims the pairs of samples with the largest absolute distance. Another example is least trimmed squares regression, where all unusually large residuals are removed before the variance is estimated. In this work, we propose two new classes of estimators that use an optimal linear combination of the trimmed samples to achieve a lower mean squared error (MSE). The first class guarantees the smallest variance among all unbiased estimators based on linear combinations, while the second class guarantees the smallest MSE at the cost of a small bias. Through simulations, we demonstrate that the proposed estimators can have considerably smaller MSE than other robust estimators, both in the presence and absence of outliers.
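To make the trimming-then-linear-combination idea concrete, the following is a minimal sketch, not the paper's method: it trims the largest squared deviations and then takes a weighted linear combination of the remainder. The uniform weights and the median-centering step are placeholder assumptions for illustration; the paper derives optimal weights that minimize variance (unbiased class) or MSE (biased class).

```python
import numpy as np

def trimmed_variance(x, trim_frac=0.1, weights=None):
    """Illustrative trimmed variance estimator.

    Sorts the squared deviations from the median, discards the
    largest `trim_frac` fraction, and returns a linear combination
    of the remaining values. Uniform weights are a placeholder and
    leave a downward bias from trimming; the paper's estimators
    instead use optimally chosen weights.
    """
    x = np.asarray(x, dtype=float)
    dev2 = np.sort((x - np.median(x)) ** 2)  # squared deviations, ascending
    k = int(np.floor(trim_frac * len(x)))    # number of samples to trim
    kept = dev2[: len(dev2) - k] if k > 0 else dev2
    if weights is None:
        weights = np.full(len(kept), 1.0 / len(kept))  # placeholder: uniform
    return float(np.dot(weights, kept))

# Toy comparison: clean normal data vs. data with gross outliers.
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 2.0, size=1000)
contaminated = np.concatenate([clean, rng.normal(0.0, 20.0, size=50)])
print(trimmed_variance(clean))          # near sigma^2 = 4, minus trimming bias
print(trimmed_variance(contaminated))   # far less inflated than np.var would be
```

On clean data the trimmed estimate sits somewhat below the true variance because uniform weights do not correct for the removed tail mass; choosing the weights optimally is precisely what distinguishes the two proposed classes.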