Abstract

Multicollinearity is a linear dependency between two or more explanatory variables in a regression model that can seriously distort the least squares estimates. The Ordinary Least Squares Estimator is an unbiased estimator of the unknown parameters in the model, but in the presence of multicollinearity the variances of the Ordinary Least Squares estimates are inflated and the regression coefficients are often indeterminate. Biased estimators are therefore suggested as alternatives to the Ordinary Least Squares Estimator. In this study, a new method of solving the multicollinearity problem through perturbation of eigenvalues is proposed. The performance of this estimator is evaluated by comparing it with some existing estimators in terms of mean squared error: the new estimator is compared with the Ordinary Least Squares Estimator, Principal Component Regression and the Ordinary Ridge Regression Estimator.
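The abstract does not give the exact form of the eigenvalue-perturbation estimator, so the sketch below is only an assumed illustration of the kind of comparison it describes: a collinear design is simulated, Ordinary Least Squares and Ordinary Ridge Regression are fitted, and a hypothetical perturbed-eigenvalue estimator (which inflates only the near-zero eigenvalues of X'X before inverting) is included for contrast. The perturbation rule, the ridge constant, and all variable names are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a design with strong multicollinearity:
# x3 is nearly a linear combination of x1 and x2.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.7 * x1 + 0.3 * x2 + rng.normal(scale=0.01, size=n)
X = np.column_stack([x1, x2, x3])
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

XtX = X.T @ X
Xty = X.T @ y

# Ordinary Least Squares: beta = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(XtX, Xty)

# Ordinary Ridge Regression: beta = (X'X + kI)^{-1} X'y  (k chosen arbitrarily here)
k = 0.1
beta_ridge = np.linalg.solve(XtX + k * np.eye(X.shape[1]), Xty)

# Hypothetical eigenvalue-perturbation estimator: inflate only the
# near-zero eigenvalues of X'X before inverting (an assumed rule).
eigvals, Q = np.linalg.eigh(XtX)
tol = 1e-3 * eigvals.max()
perturbed = np.where(eigvals < tol, eigvals + tol, eigvals)
beta_pert = Q @ np.diag(1.0 / perturbed) @ Q.T @ Xty

# Compare the estimators by squared error against the true coefficients.
for name, b in [("OLS", beta_ols), ("Ridge", beta_ridge), ("Perturbed", beta_pert)]:
    print(f"{name:10s} squared error = {np.sum((b - beta_true) ** 2):.4f}")
```

In a fuller simulation study, the squared error would be averaged over many replications to estimate the mean squared error used for comparison in the paper.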
