Abstract

Multicollinearity inflates the variance of the ordinary least squares estimators due to correlation between two or more independent variables (including the constant term). A widely applied solution is to use penalised estimators such as the ridge estimator, which trade some bias in the estimators for a reduction in their variance. Although these procedures do reduce the variance, inference and goodness of fit become problematic. Alternatively, raise regression mitigates the problems associated with multicollinearity without loss of inference or of the coefficient of determination. This paper completely formalises the raise estimator. For the first time, the norm of the estimator, the behaviour of the individual and joint significance, the behaviour of the mean squared error and the coefficient of variation are analysed. We also present the generalisation of the estimator and the relation between the raise and residualisation estimators. To provide a better understanding of raise regression, previous contributions are also summarised: its mean squared error, the variance inflation factor, the condition number, the adequate selection of the variable to be raised, successive raising, and the relation between the raise and ridge estimators. The usefulness of raise regression as an alternative for mitigating multicollinearity is illustrated with two empirical applications.
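
To make the idea concrete, the sketch below illustrates the raise transform as it is usually described in the raise-regression literature: the raised regressor is assumed to be x_j + λ·ê_j, where ê_j is the residual of the auxiliary regression of x_j on the remaining regressors. The simulated data, the raising factor λ and the choice of raised column are purely illustrative and are not taken from the paper.

```python
# Minimal sketch of raise regression (illustrative, not the paper's code).
# Assumption: the raised variable is x_j + lam * e_j, with e_j the residual of
# regressing x_j on the other regressors (intercept included).
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)        # nearly collinear with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

def ols(X, y):
    """Ordinary least squares coefficients."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def vif(X, j):
    """Variance inflation factor of column j of the design matrix X."""
    others = np.delete(X, j, axis=1)
    e = X[:, j] - others @ ols(others, X[:, j])
    r2 = 1.0 - e.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)

X = np.column_stack([np.ones(n), x1, x2])

# Raise x2: replace it with x2 + lam * (residual of x2 regressed on [1, x1]).
lam = 2.0                                        # illustrative raising factor
aux = np.column_stack([np.ones(n), x1])
e2 = x2 - aux @ ols(aux, x2)
X_raised = np.column_stack([np.ones(n), x1, x2 + lam * e2])

print("VIF of x2, original:", round(vif(X, 2), 1))
print("VIF of x2, raised  :", round(vif(X_raised, 2), 1))
print("OLS coefficients   :", ols(X, y))
print("Raise coefficients :", ols(X_raised, y))
```

Because the raised column stays inside the column space spanned by the original regressors, the fitted values (and hence the coefficient of determination) of the least squares fit are unchanged, while the variance inflation factor of the raised variable drops by roughly a factor of (1 + λ)².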
