Abstract

Multicollinearity is an important issue affecting the results of regression analysis. The LASSO, developed in recent years, offers clear advantages in selecting explanatory variables, handling high-dimensional data, and mitigating multicollinearity. The method adds a penalty term to the model estimation that can shrink the regression coefficients of unnecessary variables to exactly zero, removing them from the model and thereby performing variable selection. This paper focuses on the LASSO method and compares its results with those of best-subset selection, ridge regression, the adaptive LASSO, and the elastic net. It is found that both the LASSO and the adaptive LASSO perform well in addressing multicollinearity among the independent variables and in improving model interpretability and prediction accuracy.
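As a minimal sketch of the shrinkage behavior the abstract describes, the example below (not from the paper; it uses scikit-learn's `Lasso` with an illustrative penalty strength `alpha=0.1` and simulated collinear predictors) shows the L1 penalty driving the coefficients of redundant variables to exactly zero:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated data: x1 and x2 are nearly identical (severe multicollinearity),
# x3 is irrelevant noise; y depends only on x1.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)   # almost a copy of x1
x3 = rng.normal(size=n)               # unrelated to y
X = np.column_stack([x1, x2, x3])
y = 3.0 * x1 + rng.normal(scale=0.1, size=n)

# The L1 penalty (alpha is an assumed illustrative value) shrinks
# redundant coefficients to exactly zero, performing variable selection.
model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)  # at least one coefficient is exactly 0
```

In contrast, ordinary least squares or ridge regression would spread weight across both collinear columns; the LASSO's corner-shaped constraint region is what allows exact zeros.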
