Abstract

Common problems in multiple regression models are multicollinearity and non-normal errors, both of which have undesirable effects on the least squares estimator. It therefore seems important to combine estimation methods designed to deal with these problems. In this paper, several estimation methods are compared experimentally, namely ordinary least squares (LS), Ridge Regression (R), Ridge Least Absolute Deviation (RLAD), Weighted Ridge (WR), Robust M-estimation (M), and Robust Ridge regression based on M-estimation (RM). A simulation study shows that the robust ridge regression estimator (RM) is more efficient than the other estimators, in terms of the mean squared error criterion, for many combinations of error distribution and degree of multicollinearity.

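The sketch below illustrates the kind of simulation comparison the abstract describes, for a subset of the estimators (LS, R, M, RM). It is not the authors' design: the multicollinear predictors are generated through a shared latent factor, the non-normal errors are t-distributed, the ridge constant uses a Hoerl-Kennard-Baldwin-type rule, and the robust ridge estimator is taken as one common formulation, beta_RM = (X'X + kI)^{-1} X'X beta_M. All of these are illustrative assumptions.

# Minimal simulation sketch (illustrative assumptions, not the paper's code):
# compare coefficient MSE of LS, ridge (R), Huber M (M), and robust ridge (RM)
# under multicollinearity and heavy-tailed errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p, reps = 50, 4, 500
beta_true = np.ones(p)

def make_X(rho):
    # Columns share a common latent factor; larger rho -> stronger multicollinearity.
    z = rng.standard_normal((n, 1))
    return np.sqrt(1 - rho**2) * rng.standard_normal((n, p)) + rho * z

def estimators(X, y):
    XtX = X.T @ X
    # Least squares (LS)
    b_ls = np.linalg.solve(XtX, X.T @ y)
    # Ridge constant via a Hoerl-Kennard-Baldwin-type rule (assumption)
    resid = y - X @ b_ls
    sigma2 = resid @ resid / (n - p)
    k = p * sigma2 / (b_ls @ b_ls)
    # Ridge (R)
    b_r = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
    # Huber M-estimator (M) via statsmodels
    b_m = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params
    # Robust ridge based on M-estimation (RM), one common formulation (assumption)
    b_rm = np.linalg.solve(XtX + k * np.eye(p), XtX @ b_m)
    return {"LS": b_ls, "R": b_r, "M": b_m, "RM": b_rm}

mse = {name: 0.0 for name in ("LS", "R", "M", "RM")}
for _ in range(reps):
    X = make_X(rho=0.95)                 # high degree of multicollinearity
    e = rng.standard_t(df=3, size=n)     # heavy-tailed (non-normal) errors
    y = X @ beta_true + e
    for name, b in estimators(X, y).items():
        mse[name] += np.sum((b - beta_true) ** 2) / reps

print(mse)  # the abstract reports RM as most efficient in settings like this

Each replication simulates one data set with both problems present, computes the four coefficient vectors, and accumulates their squared estimation error; the printed averages play the role of the mean squared error criterion used in the paper's comparison.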