Abstract

Linear regression models the relationship between a response variable and one or more regressor variables. The ordinary least squares (OLS) method is commonly used to estimate the parameters of a linear regression model and produces the best linear unbiased estimates when all of its assumptions are met; here, "best" means the estimate is unbiased and has minimum variance. In real data, however, these assumptions are often violated, for example by multicollinearity and outliers. Multicollinearity inflates the variance of the estimated regression parameters, while outliers bias the estimates. The jackknife ridge M-estimator is recommended when both problems are present: the M-estimation component makes the estimator robust to outliers, the ridge component overcomes multicollinearity, and the jackknife step reduces the bias introduced by the ridge method. In multiple linear regression, a model may contain many candidate regressor variables, and subset selection is used to choose the regressors that best predict the response. When outliers and multicollinearity are present, the GSp criterion can be used as the selection criterion, because it is based on the jackknife ridge M-estimator and can therefore handle both problems. The GSp criterion is then applied to obtain the best model for predicting IQ from several predictors. It was found that IQ could be predicted using only three of the five personality tests, namely tests 1, 3, and 5, instead of all five.
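The core estimator the abstract builds on, a ridge M-estimator, can be sketched as iteratively reweighted least squares with a robust (Huber) weight function and a ridge penalty. The sketch below is a minimal illustration under assumed choices (Huber tuning constant c = 1.345, MAD-based scale, a fixed ridge constant k); it is not the paper's exact jackknife ridge M-estimator, which additionally applies a jackknife step to reduce the ridge-induced bias.

```python
import numpy as np

def huber_weights(r, c=1.345):
    # Huber weight function: weight 1 for small residuals, c/|r| beyond c,
    # so large residuals (outliers) are downweighted.
    a = np.abs(r)
    w = np.ones_like(a)
    mask = a > c
    w[mask] = c / a[mask]
    return w

def ridge_m_estimate(X, y, k=1.0, n_iter=50):
    # Illustrative ridge M-estimator via IRLS (assumed formulation):
    # alternate between robust residual weights and a weighted ridge solve
    #   beta = (X'WX + kI)^{-1} X'Wy
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)  # ridge start
    for _ in range(n_iter):
        r = y - X @ beta
        # Robust scale estimate via the median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        w = huber_weights(r / s)
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + k * np.eye(p), X.T @ W @ y)
    return beta

# Usage: nearly collinear regressors plus one gross outlier in y
X = np.array([[1.0, 1.0], [1.0, 1.01], [2.0, 2.0], [3.0, 3.02], [4.0, 4.0]])
y = X @ np.array([1.0, 2.0])
y[0] += 50.0  # inject an outlier
beta_hat = ridge_m_estimate(X, y, k=0.1)
```

The ridge constant k controls the bias–variance trade-off under multicollinearity; the jackknife step (omitted here) would then correct part of the bias that this shrinkage introduces.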
