Abstract

The linear regression model describes the relationship between a dependent variable and a set of independent variables. The ordinary least squares estimator (OLSE) is widely used to estimate the model's parameters. However, the OLSE breaks down when the independent variables are linearly dependent or nearly so, a condition called multicollinearity. The Kibria-Lukman estimator (KLE) was proposed as an alternative to the OLSE and to other biased estimators such as the ridge and Liu estimators. In this paper, we develop a jackknifed version of the Kibria-Lukman estimator, which we call the Jackknifed KL estimator (JKLE). We derive the statistical properties of the new estimator and compare it theoretically with the KLE and other existing estimators. The theoretical results show that the JKLE attains the smallest mean squared error (MSE) among the estimators compared. Finally, the JKLE reduces both the bias and the MSE of the KLE in a simulation study and in a real-life application, and it dominates the other methods considered in this study.
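For intuition, the sketch below illustrates the estimators the abstract compares, assuming the standard Kibria-Lukman form beta_KL = (X'X + kI)^{-1}(X'X - kI) beta_OLS and a generic delete-one jackknife bias correction. The collinear design, the plug-in rule for the biasing parameter k, and the resampling form of the jackknife are illustrative assumptions, not the paper's closed-form JKLE derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a design matrix whose columns share a common factor,
# producing the near-linear dependence (multicollinearity) described above.
n, p = 100, 4
gamma = 0.99  # degree of collinearity (assumed for illustration)
X = np.sqrt(1 - gamma**2) * rng.standard_normal((n, p)) \
    + gamma * rng.standard_normal((n, 1))
beta_true = np.ones(p)
y = X @ beta_true + rng.standard_normal(n)

XtX, Xty, I = X.T @ X, X.T @ y, np.eye(p)

# OLS estimate: (X'X)^{-1} X'y  (unstable under multicollinearity)
beta_ols = np.linalg.solve(XtX, Xty)

# KL estimate: (X'X + kI)^{-1} (X'X - kI) beta_OLS,
# with a simple plug-in choice of k (illustrative, not the paper's rule).
sigma2 = np.sum((y - X @ beta_ols) ** 2) / (n - p)
k = sigma2 / np.max(beta_ols**2)


def kl_fit(Xs, ys, k):
    """KL estimate on a given sample, used for the leave-one-out fits."""
    XtXs = Xs.T @ Xs
    Is = np.eye(Xs.shape[1])
    b_ols = np.linalg.solve(XtXs, Xs.T @ ys)
    return np.linalg.solve(XtXs + k * Is, (XtXs - k * Is) @ b_ols)


beta_kl = kl_fit(X, y, k)

# Generic delete-one jackknife bias correction applied to the KL estimate:
# beta_J = n * beta_KL - (n - 1) * mean(leave-one-out KL estimates).
loo = np.array([kl_fit(np.delete(X, i, 0), np.delete(y, i), k)
                for i in range(n)])
beta_jkl = n * beta_kl - (n - 1) * loo.mean(axis=0)

print("OLS :", beta_ols)
print("KL  :", beta_kl)
print("JKL :", beta_jkl)
```

Running the sketch with a heavily collinear design typically shows the OLS coefficients varying wildly across seeds while the KL and jackknifed estimates stay closer to the true coefficients, which is the behavior the abstract's simulation results describe.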
