Abstract

Problem statement: In the presence of multicollinearity, the estimation of parameters in multiple linear regression models by Ordinary Least Squares (OLS) is known to suffer severe distortion. An alternative approach is a modified OLS based on the latent roots and latent vectors of the correlation matrix of the independent and dependent variables. This procedure, called Latent Root Regression (LRR), serves to improve the stability of the estimates for data plagued by multicollinearity. However, there is evidence that LRR estimators are easily affected by a few atypical observations, which we call outliers. It is also evident that a robust method alone cannot rectify the combined problems of multicollinearity and outliers. Approach: In this study, we propose a robust procedure for estimating the regression parameters in the presence of multicollinearity and outliers. We call this method Latent Root-M based Regression (LRMB) because it employs the weights of the M-estimator in a weighted correlation matrix. Numerical examples and simulation studies are presented to illustrate the performance of the newly proposed method. Results: The results show that the LRMB method is more efficient than the existing methods. Conclusion/Recommendations: To obtain reliable estimates, we recommend using LRMB when both multicollinearity and outliers are present in the data.

Highlights

  • Consider a multiple linear regression model Y = Xβ + ε, with residual vector r(β) = Y − Xβ

  • Numerical example: In order to compare the performance of the Latent Root-M based Regression (LRMB) with other existing methods, namely Ordinary Least Squares (OLS), Latent Root Regression (LRR) and robust M-estimation (ROBM), two real data sets are considered

  • Incorporating the weights obtained from the final step of the ROBM estimator yields the robust-weighted correlation matrix, whose latent roots and latent vectors are displayed in Tables 1 and 2, respectively
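The robust-weighted correlation matrix described in this highlight can be sketched as follows. This is a hypothetical illustration of the LRMB idea, not the authors' implementation: the Huber weight function, the tuning constant c = 1.345, and the synthetic two-variable data are all assumptions made for demonstration.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber M-estimator weight w(r) = min(1, c/|r|) on standardized residuals."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / a)

def weighted_corr(Z, w):
    """Weighted correlation matrix of the columns of Z with case weights w."""
    w = w / w.sum()
    mu = w @ Z                                # weighted column means
    Zc = Z - mu
    cov = (Zc * w[:, None]).T @ Zc            # weighted covariance
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 2 * x + rng.normal(scale=0.5, size=n)
y[0] += 20.0                                  # one gross outlier

# Residuals would normally come from a robust fit; the true slope is used
# here only to keep the sketch short.
r = y - 2 * x
r = r / (np.median(np.abs(r - np.median(r))) * 1.4826)  # robust standardization

w = huber_weights(r)                          # outlier receives a small weight
A = weighted_corr(np.column_stack([y, x]), w)
roots, vectors = np.linalg.eigh(A)            # latent roots and latent vectors
```

Because the outlying case is down-weighted before the correlation matrix is formed, its influence on the latent roots and vectors is suppressed, which is the mechanism the LRMB method exploits.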


Introduction

Consider a multiple linear regression model Y = Xβ + ε, where the residual vector is r(β) = Y − Xβ. Minimizing the sum of squared residuals gives the OLS estimator β̂ = (X'X)⁻¹X'Y. The presence of multicollinearity produces inflated standard errors that lead to misleading parameter inferences. To remedy this problem, Hawkins[1], Gunst and Mason[2], Gunst et al.[3] and Lawrence and Arthur[4] introduced a biased estimation procedure known as Latent Root Regression (LRR) to improve the precision of the regression estimates.
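The OLS estimator and the latent roots underlying LRR can be sketched as follows. This is an illustrative sketch with synthetic data, not the authors' code; the near-collinear predictors and the noise scales are assumptions chosen to make multicollinearity visible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# OLS estimator beta_hat = (X'X)^{-1} X'y, computed via least squares
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# LRR works with the correlation matrix of the augmented data [y, x1, x2];
# a latent root near zero signals the multicollinearity.
A = np.corrcoef(np.column_stack([y, x1, x2]), rowvar=False)
latent_roots, latent_vectors = np.linalg.eigh(A)  # ascending latent roots
print("smallest latent root:", latent_roots[0])
```

The small latent root (and its latent vector) is exactly the quantity LRR inspects to decide which near-singular directions to discard when forming the biased estimator.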
