Abstract

The linear regression model assumes independent errors, but in practice this assumption is often violated: linear regression with correlated errors arises in many contexts, and correlated errors can seriously degrade the robustness of the model. Robust parameter estimates can be obtained with the M-estimator, but it gains robustness at the cost of efficiency, whereas the minimum Matusita distance estimator has been shown to achieve both robustness and efficiency. On the other hand, the Cochrane-Orcutt adjusted least squares estimator is unaffected by the correlation of the errors and therefore has good efficiency. Using non-parametric kernel density estimation, this article proposes a new method for obtaining the minimum Matusita distance estimator of the linear regression model with correlated errors in the presence of outliers. The proposed estimator is evaluated in a simulation study and on real data. In the simulations, it shows smaller biases and mean squared errors than the M-estimator and the Cochrane-Orcutt adjusted least squares estimator; on the real data, it has smaller standard errors than the two other estimators.
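To illustrate the core idea, the following sketch estimates a single regression slope by minimizing the Matusita distance between a kernel density estimate of the residuals and an assumed error density. This is a minimal sketch under our own simplifying assumptions, not the paper's implementation: the one-parameter model y = beta * x + e, the fixed N(0, 1) error density, the grid limits, and the function names are ours, and the correlated-error structure treated in the paper is ignored here.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize_scalar
from scipy.stats import gaussian_kde, norm

def matusita_distance(resid, grid):
    """Matusita distance between a kernel density estimate of the
    residuals and an assumed N(0, 1) error density, on a fixed grid."""
    f_hat = gaussian_kde(resid)(grid)   # non-parametric residual density
    g = norm.pdf(grid)                  # assumed parametric error density
    return trapezoid((np.sqrt(f_hat) - np.sqrt(g)) ** 2, grid)

def min_matusita_slope(x, y, bounds=(-10.0, 10.0)):
    """Slope estimate for the toy model y = beta * x + e obtained by
    minimizing the Matusita distance of the residual KDE to N(0, 1).
    (Illustrative names and model; not the authors' estimator.)"""
    grid = np.linspace(-8.0, 8.0, 400)
    objective = lambda beta: matusita_distance(y - beta * x, grid)
    return minimize_scalar(objective, bounds=bounds, method="bounded").x
```

In this formulation the Matusita distance coincides with the squared Hellinger distance between the two densities; because outlying residuals contribute little to the square-root densities, the resulting estimator downweights outliers, which is the intuition behind its robustness.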
