Abstract

Statistics practitioners have relied on the ordinary least squares (OLS) method in the linear regression model for generations because of its optimal properties and computational simplicity. However, OLS estimators can be strongly affected by multicollinearity, a near-linear dependency between two or more independent variables in the regression model. Although the OLS estimates remain unbiased in the presence of multicollinearity, the standard errors of the estimated regression coefficients are inflated, leading to inaccurate predictions of the dependent variable. It is now evident that high leverage points, which are outliers in the x-direction, are a prime source of collinearity-influential observations. In this paper, we propose an alternative regression method for estimating the regression coefficients in the presence of multiple high leverage points that cause the multicollinearity problem. The procedure uses the ordinary least squares estimates of the parameters as initial values, followed by a ridge regression estimate. We incorporate the Least Trimmed Squares (LTS) robust regression estimator to down-weight the effects of multiple high leverage points, which in turn reduces the effects of multicollinearity. The results suggest that the RLTS estimator gives a substantial improvement over ridge regression.
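
Below is a minimal sketch of the kind of robust ridge procedure described in the abstract: an OLS fit as the starting point, a crude Least Trimmed Squares (LTS) step whose residuals are used to down-weight high leverage points, and a weighted ridge estimate at the end. Function names, tuning constants, and the weighting scheme are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: OLS initial fit -> crude LTS fit -> weighted ridge.
# All names and tuning constants are assumptions for illustration only.
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares coefficients (intercept column already in X)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def lts_fit(X, y, coverage=0.75, n_starts=50, n_csteps=20, rng=None):
    """Crude LTS estimate via random elemental starts and concentration steps."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    h = int(np.ceil(coverage * n))           # number of observations kept
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        subset = rng.choice(n, size=p, replace=False)
        beta = np.linalg.lstsq(X[subset], y[subset], rcond=None)[0]
        for _ in range(n_csteps):            # concentration steps
            resid = y - X @ beta
            keep = np.argsort(np.abs(resid))[:h]
            beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta

def robust_ridge(X, y, lam=1.0, coverage=0.75, c=2.5, rng=None):
    """Weighted ridge regression with weights derived from LTS residuals."""
    beta_ols = ols_fit(X, y)                 # OLS as the initial estimate
    beta_lts = lts_fit(X, y, coverage=coverage, rng=rng)
    resid = y - X @ beta_lts
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust scale
    w = np.where(np.abs(resid) <= c * scale, 1.0,
                 (c * scale) / np.abs(resid))  # down-weight outlying points
    W = np.diag(w)
    p = X.shape[1]
    beta_rr = np.linalg.solve(X.T @ W @ X + lam * np.eye(p), X.T @ W @ y)
    return beta_rr, beta_ols

# Small usage example: two near-collinear predictors plus a few injected
# high leverage points, comparing plain OLS with the robust ridge sketch.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)   # near-collinear with x1
    X = np.column_stack([np.ones(n), x1, x2])
    y = 1 + 2 * x1 + 3 * x2 + rng.normal(scale=0.5, size=n)
    X[:5, 1:] += 10.0                          # inject high leverage points
    beta_rr, beta_ols = robust_ridge(X, y, lam=1.0, rng=1)
    print("OLS:         ", np.round(beta_ols, 3))
    print("Robust ridge:", np.round(beta_rr, 3))
```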
