The most popular estimator for the parameters of a linear regression model is the Ordinary Least Squares (OLS) estimator. OLS is the best linear unbiased estimator when the classical assumptions hold. However, when autocorrelation, multicollinearity, and heavy-tailed errors are jointly present in the data, the OLS estimator becomes inefficient and imprecise. In this paper, we developed an estimator of the linear regression model parameters that jointly handles multicollinearity, autocorrelation, and heavy-tailed errors. The new estimator, LADHLKL, was derived by combining the Hildreth-Lu (HL), Kibria-Lukman (KL), and Least Absolute Deviation (LAD) estimators. The LADHLKL estimator possesses the characteristics of the LAD, HL, and KL estimators, which makes it resistant to all three problems. We examined the properties of the proposed estimator and compared its performance with other existing estimators in terms of mean squared error. A simulation study and an application to real-life data revealed that the proposed estimator dominates the other estimators in terms of mean squared error under all the conditions considered.
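As a rough illustration of how the three components could be combined in practice, the sketch below chains a Hildreth-Lu AR(1) transformation (for autocorrelation), an LAD fit via iteratively reweighted least squares (for heavy-tailed errors), and a Kibria-Lukman-type shrinkage adjustment (for multicollinearity). This is a minimal sketch under those assumptions, not the authors' exact LADHLKL derivation; the function names, the grid for the autocorrelation parameter `rho_grid`, and the shrinkage constant `k_shrink` are hypothetical choices for the example.

```python
# Illustrative sketch only; not the authors' exact LADHLKL estimator.
import numpy as np

def lad_irls(X, y, n_iter=50, eps=1e-6):
    """Least Absolute Deviation fit via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]            # OLS starting value
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)    # L1 weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

def hildreth_lu_transform(X, y, rho):
    """Quasi-difference the data to remove AR(1) autocorrelation."""
    return X[1:] - rho * X[:-1], y[1:] - rho * y[:-1]

def kl_adjust(X, beta, k):
    """Kibria-Lukman-type shrinkage: (X'X + kI)^(-1) (X'X - kI) beta."""
    G = X.T @ X
    I = np.eye(G.shape[0])
    return np.linalg.solve(G + k * I, (G - k * I) @ beta)

def ladhlkl_sketch(X, y, rho_grid=np.linspace(-0.9, 0.9, 19), k_shrink=0.1):
    """Grid-search rho (Hildreth-Lu), fit LAD on the transformed data,
    then apply the KL shrinkage to the resulting coefficients."""
    best = None
    for rho in rho_grid:
        Xs, ys = hildreth_lu_transform(X, y, rho)
        beta = lad_irls(Xs, ys)
        sae = np.sum(np.abs(ys - Xs @ beta))               # L1 fit criterion
        if best is None or sae < best[0]:
            best = (sae, rho, Xs, beta)
    _, rho, Xs, beta = best
    return kl_adjust(Xs, beta, k_shrink), rho
```

In this arrangement the transformation, robust fitting, and shrinkage steps are applied sequentially; the paper's estimator combines the three components analytically, so the sketch should be read only as a conceptual outline of the ingredients named in the abstract.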