Abstract

In this paper, the performance of existing biased estimators (Ridge Estimator (RE), Almost Unbiased Ridge Estimator (AURE), Liu Estimator (LE), Almost Unbiased Liu Estimator (AULE), Principal Component Regression Estimator (PCRE), r-k class estimator and r-d class estimator) and the respective predictors was considered in a misspecified linear regression model when multicollinearity exists among the explanatory variables. A generalized form was used to compare these estimators and predictors in the mean square error sense. Further, theoretical findings were established using the mean square error matrix and the scalar mean square error. Finally, a numerical example and a Monte Carlo simulation study were carried out to illustrate the theoretical findings. The simulation study revealed that LE and RE outperform the other estimators when weak multicollinearity exists, while RE and the r-k class and r-d class estimators outperform the others when moderate and high multicollinearity exist, respectively, for certain values of the shrinkage parameters. The predictors based on LE and RE are always superior to the other predictors for certain values of the shrinkage parameters.

Highlights

  • It is well known that the misspecification of the linear model is unavoidable in practical situations

  • The performance of existing biased estimators (Ridge Estimator (RE), Almost Unbiased Ridge Estimator (AURE), Liu Estimator (LE), Almost Unbiased Liu Estimator (AULE), Principal Component Regression Estimator (PCRE), r-k class estimator and r-d class estimator) and the respective predictors was considered in a misspecified linear regression model when multicollinearity exists among the explanatory variables

  • Note that the predictors based on the ordinary least squares estimator (OLSE), RE, AURE, LE, AULE, PCRE, r-k class estimator and r-d class estimator are denoted by ŷ, ŷ_k, ŷ_AURE, ŷ_d, ŷ_AULE, ŷ_PCR, ŷ_rk and ŷ_rd, respectively

Summary

Introduction

It is well known that misspecification of the linear model is unavoidable in practical situations. It is also well known that the ordinary least squares estimator (OLSE) loses its desirable properties when multicollinearity exists among the explanatory variables in the regression model. To overcome this problem, biased estimators based on the sample model y = Xβ + ε, or on the sample model combined with exact or stochastic restrictions, have been used in the literature. Teräsvirta [4] discussed biased estimation with stochastic linear restrictions in a regression model misspecified by the inclusion of an irrelevant variable with incorrectly specified prior information. The efficiency of the Mixed Regression Estimator (MRE) in a regression model misspecified by the exclusion of a relevant variable with correctly specified prior information was discussed by Mittelhammer [5], Ohtani and Honda [6], Kadiyala [7], and Trenkler and Wijekoon [8]. The references and Appendix are given at the end of the paper.
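As a minimal sketch of the comparison the paper studies, the following Python/NumPy code contrasts the OLSE, β̂ = (X'X)⁻¹X'y, with the ridge estimator, β̂_k = (X'X + kI)⁻¹X'y, under strong multicollinearity. The design-matrix construction, the true coefficient vector, the shrinkage value k, and the number of replications are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative design with strong multicollinearity: each column mixes an
# independent component with a shared latent factor; rho controls strength.
n, p, rho = 50, 4, 0.999
Z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - rho**2) * Z[:, :p] + rho * Z[:, [p]]

beta = np.ones(p)   # true coefficients (assumed for this sketch)
k = 0.5             # ridge shrinkage parameter (illustrative choice)

def olse(X, y):
    """Ordinary least squares estimator (X'X)^(-1) X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    """Ridge estimator (X'X + kI)^(-1) X'y."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

# Small Monte Carlo run: average squared estimation error over replications.
reps = 500
err_ols = err_ridge = 0.0
for _ in range(reps):
    y = X @ beta + rng.standard_normal(n)
    err_ols += np.sum((olse(X, y) - beta) ** 2)
    err_ridge += np.sum((ridge(X, y, k) - beta) ** 2)

print(f"empirical MSE, OLSE : {err_ols / reps:.3f}")
print(f"empirical MSE, ridge: {err_ridge / reps:.3f}")
```

Under this near-singular X'X, the OLSE variance inflates along the nearly collinear directions, so the biased ridge estimator typically attains a much smaller empirical mean square error, which is the kind of trade-off the paper's theoretical comparisons formalize.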

Model Specification
Mean Square Error Comparisons
Monte Carlo Simulation Study
Conclusions