Abstract

In the presence of the multicollinearity problem, parameter estimation based on the ordinary least squares procedure is unsatisfactory. In 1970, Hoerl and Kennard introduced an alternative method known as the ridge regression estimator.
In this estimator, the ridge parameter plays an important role in estimation, and many statisticians have proposed methods for selecting this biasing constant (ridge parameter). Another popular method for dealing with the multicollinearity problem is the principal component method. In this paper, we employ simulation to compare the performance of the principal component estimator with several types of ordinary ridge regression estimators that differ in the value of the biasing constant (ridge parameter). The mean square error (MSE) is used as the criterion to assess the performance of these estimators.

Highlights

  • Consider the linear regression model y = Xβ + ε (1), where y is an n × 1 vector of the response variable, X is an n × p matrix of explanatory variables with n > p, β is a p × 1 vector of unknown parameters, and ε is an n × 1 vector of unobservable random errors with E(ε) = 0 and Var(ε) = σ²Iₙ. The aim of regression analysis is to estimate the numerical values of the linear model parameters. Recently, biased estimators of regression parameters have attracted the attention of many researchers, because the ordinary least squares procedure is unable to provide reasonable point estimates when the matrix of explanatory variables suffers from multicollinearity. Throughout the paper, we refer to the ridge regression estimators and the principal component estimators as alternatives to the ordinary least squares estimators under multicollinear data

  • The multicollinearity problem occurs when there exists an exact linear relationship or an approximate linear relationship among two or more explanatory variables; accordingly, two types of multicollinearity may be faced in regression analysis: exact and near multicollinearity

  • Ordinary ridge estimates are computed using the different ridge parameters given in equations (8) to (11), and the principal components regression estimates are computed using equations (12) to (15)
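The comparison described in the highlights can be sketched as follows. This is a minimal illustration, not the paper's own simulation design: the specific ridge-parameter formulas of equations (8)–(11) are not reproduced here, so a few generic values of a single biasing constant k are used, and only one simulated replication is shown (the paper averages the MSE criterion over many samples).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative design with two nearly collinear columns (hypothetical setup).
n, p = 100, 4
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + rng.normal(scale=0.01, size=n)  # near collinearity
beta = np.ones(p)
y = X @ beta + rng.normal(size=n)

XtX = X.T @ X

# Ordinary ridge estimator: beta(k) = (X'X + kI)^{-1} X'y
def ridge(k):
    return np.linalg.solve(XtX + k * np.eye(p), X.T @ y)

# Principal components estimator: regress y on the r leading principal
# components of X, then transform the coefficients back to the beta scale.
def pcr(r):
    eigval, eigvec = np.linalg.eigh(XtX)
    order = np.argsort(eigval)[::-1]
    V = eigvec[:, order[:r]]            # eigenvectors of the r largest eigenvalues
    T = X @ V                           # component scores
    gamma = np.linalg.solve(T.T @ T, T.T @ y)
    return V @ gamma                    # back-transform to the original coefficients

# Squared distance to the true beta for a few generic ridge parameters
# and for the principal components estimator that drops one component.
for k in (0.0, 0.1, 1.0):
    print(k, np.sum((ridge(k) - beta) ** 2))
print("pcr", np.sum((pcr(p - 1) - beta) ** 2))
```

With k = 0 the ridge estimator reduces to ordinary least squares, so the printout makes the effect of the biasing constant on a collinear design directly visible.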


Summary

Introduction

Consider the linear regression model y = Xβ + ε (1), where y is an n × 1 vector of the response variable, X is an n × p matrix of explanatory variables with n > p, β is a p × 1 vector of unknown parameters, and ε is an n × 1 vector of unobservable random errors with E(ε) = 0 and Var(ε) = σ²Iₙ. The aim of regression analysis is to estimate the numerical values of the linear model parameters. Recently, biased estimators of regression parameters have attracted the attention of many researchers, because the ordinary least squares procedure is unable to provide reasonable point estimates when the matrix of explanatory variables suffers from multicollinearity. Throughout the paper, we refer to the ridge regression estimators and the principal component estimators as alternatives to the ordinary least squares estimators under multicollinear data.
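The instability of ordinary least squares under near multicollinearity can be seen in a small sketch. The design below is an assumed, illustrative one (it is not taken from the paper): one column is made an almost exact copy of another, which drives the condition number of X'X up and inflates the variance of the OLS estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated design with two nearly collinear columns (illustrative only).
n, p = 50, 3
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-3, size=n)  # near-exact linear dependence on x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
beta = np.array([1.0, 2.0, 3.0])
y = X @ beta + rng.normal(size=n)

# OLS estimator: beta_hat = (X'X)^{-1} X'y
XtX = X.T @ X
beta_ols = np.linalg.solve(XtX, X.T @ y)

# A large condition number of X'X signals multicollinearity.
cond = np.linalg.cond(XtX)
print(cond)      # very large, since x2 is almost a copy of x1
print(beta_ols)  # individual coefficients can be far from (1, 2, 3)
```

This is the situation the paper addresses: the OLS point estimates remain computable but become unreliable, which motivates the biased ridge and principal component alternatives.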

Results
Conclusion
Full Text

