Abstract

Statistical inference with the ordinary least squares (OLS) estimator is frequently adversely affected when multicollinearity is present in the linear regression model. In this article, to mitigate the effects of multicollinearity, we propose the modified Kibria–Lukman principal component (MKLPC) estimator for the linear regression model by combining the principal component regression (PCR) estimator and the modified Kibria–Lukman (MKL) estimator. We also derive the necessary and sufficient conditions for the superiority of the MKLPC estimator over the OLS, PCR, Ridge, r-k, Liu, r-d, k-d, KL, and MKL estimators under the mean squared error (MSE) criterion. Furthermore, we conduct a Monte Carlo simulation study and an empirical analysis to compare these estimators under the MSE criterion.
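As context for the comparison the abstract describes, the following is a minimal Monte Carlo sketch (Python, NumPy only) of how such MSE comparisons are typically run under an induced-collinearity design. It covers only the OLS, Ridge, PCR, and Kibria–Lukman (KL) baselines, whose formulas are standard; the MKL and MKLPC estimators are not specified in the abstract, so they are omitted. The collinearity level `gamma`, the shrinkage parameter `k`, and the number of retained components `r` are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Common simulation design: columns share a latent factor, so
# pairwise correlation is roughly gamma**2 (here ~0.98).
n, p, gamma = 100, 4, 0.99
Z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - gamma**2) * Z[:, :p] + gamma * Z[:, [p]]
X = (X - X.mean(0)) / X.std(0)   # standardize, as is customary in ridge-type studies

beta = np.ones(p) / np.sqrt(p)   # true coefficients with unit norm
k = 0.5                          # illustrative shrinkage parameter (not data-driven)
r = 3                            # illustrative number of retained components

S = X.T @ X
eigval, T = np.linalg.eigh(S)    # ascending eigenvalues
T_r = T[:, ::-1][:, :r]          # eigenvectors of the r largest eigenvalues

def estimators(y):
    b_ols = np.linalg.solve(S, X.T @ y)
    # Ridge: (S + kI)^{-1} X'y
    b_ridge = np.linalg.solve(S + k * np.eye(p), X.T @ y)
    # Kibria-Lukman (2020): (S + kI)^{-1} (S - kI) b_OLS
    b_kl = np.linalg.solve(S + k * np.eye(p), (S - k * np.eye(p)) @ b_ols)
    # PCR: restrict estimation to the leading principal components
    b_pcr = T_r @ np.linalg.solve(T_r.T @ S @ T_r, T_r.T @ X.T @ y)
    return b_ols, b_ridge, b_kl, b_pcr

reps = 500
mse = np.zeros(4)
for _ in range(reps):
    y = X @ beta + rng.standard_normal(n)
    for i, b in enumerate(estimators(y)):
        mse[i] += np.sum((b - beta) ** 2)
mse /= reps

for name, m in zip(["OLS", "Ridge", "KL", "PCR"], mse):
    print(f"{name}: {m:.3f}")
```

Under collinearity this strong, the estimated MSE of OLS is typically the largest of the four, which is precisely the motivation for the shrinkage- and component-based alternatives the article compares.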
