Abstract

Forecasting models with high-order interactions have become popular in many applications, as researchers have gradually noticed that an additive linear model is not adequate for accurate forecasting. However, an excessive number of variables combined with a low sample size poses critical challenges to prediction accuracy. To enhance forecasting accuracy and training speed simultaneously, an interpretable model is essential for knowledge recovery. To deal with ultra-high dimensionality, this paper investigates a two-stage procedure that enforces sparsity within a high-order interaction model. In each stage, the square root hard ridge (SRHR) method is applied to discover the relevant variables. The square root loss function facilitates parameter tuning, while the hard ridge penalty function handles both high multicollinearity and selection inconsistency. Experiments on real data reveal performance superior to that of competing approaches.
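The objective behind SRHR can be sketched as follows (a hedged reading: the notation and the exact form of the hard ridge penalty are assumptions, not taken verbatim from the paper). The square root loss replaces the usual residual sum of squares, and the hard ridge penalty couples an ℓ0-type selection term with a ridge term:

  minimize over β:  ‖y − Xβ‖₂ / √n + λ‖β‖₀ + (η/2)‖β‖₂²

Because the square root loss is self-scaling in the noise level, λ can be tuned without first estimating the error standard deviation, which is the tuning advantage the abstract refers to; the ridge term stabilizes the fit under high multicollinearity, and the ℓ0-type term enforces selection.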

Highlights

  • Sparse representation has attracted a great deal of attention from researchers in different scientific fields, including financial engineering and risk management, computational biology, and the machine learning community [1]

  • We investigate the square root hard ridge (SRHR) method, which combines the benefits of the square root loss function and the nonconvex hard ridge penalty function for high-order interaction selection

  • Two-stage LASSO (TSLASSO) selects a model with 6311 three-way interaction terms, while two-stage SCAD (TSSCAD) selects only 3 variables, fewer than those of the two-stage square root lasso (TSSRL) and the two-stage square root hard ridge method (TSSRHR)


Summary

Introduction

Sparse representation has attracted a great deal of attention from researchers in different scientific fields, including financial engineering and risk management, computational biology, and the machine learning community [1]. Researchers have noticed that a linear additive model using main effects only cannot provide accurate forecasting results. This motivates them to turn their attention to high-order interaction models, which consider interaction terms between variables. Tibshirani (1996) proposed the least absolute shrinkage and selection operator (LASSO), which applies an L1-type penalty to pursue sparsity; in the wavelet literature it is known as the basis pursuit method [9]. Bien et al. (2013) studied the hierarchical LASSO (HL), which considers an L1 optimization problem under the nonconvex constraint ‖φj‖1 ≤ |bj| [22]. She et al. (2014) studied the GRESH method, which enforces sparsity on main effects and two-way interactions simultaneously [23]. The main contribution of this work is: (i) a two-stage square root hard ridge method (TSSRHR) for sparse representation in the high-order interaction model is proposed.
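The two-stage idea can be sketched in code. The sketch below uses a plain coordinate-descent lasso as a stand-in for the SRHR estimator (the SRHR thresholding rule itself is not reproduced here); the function names, tuning values, and toy data are illustrative assumptions, not the authors' implementation. Stage 1 screens main effects; stage 2 re-fits on the survivors plus all their pairwise interactions, so interaction terms are only ever built among a small screened set.

```python
import numpy as np
from itertools import combinations

def lasso_cd(X, y, alpha, n_iter=200):
    """Coordinate-descent lasso: (1/2n)||y - Xb||^2 + alpha*||b||_1.
    Illustrative stand-in for the paper's SRHR solver."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                       # residual y - X @ beta (beta starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]     # remove j's current contribution
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - n * alpha, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

def two_stage_selection(X, y, alpha1=0.1, alpha2=0.1, tol=1e-8):
    """Stage 1 screens main effects; stage 2 selects among the
    survivors and all their pairwise interactions."""
    kept = np.where(np.abs(lasso_cd(X, y, alpha1)) > tol)[0]
    if kept.size == 0:
        return kept, [], np.array([])
    pairs = list(combinations(kept, 2))
    cols = [X[:, j] for j in kept] + [X[:, i] * X[:, j] for i, j in pairs]
    Z = np.column_stack(cols)
    return kept, pairs, lasso_cd(Z, y, alpha2)

# Toy example: y depends on x0, x1 and their interaction.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = (2 * X[:, 0] + 3 * X[:, 1] + 4 * X[:, 0] * X[:, 1]
     + 0.1 * rng.standard_normal(200))
kept, pairs, coef = two_stage_selection(X, y)
```

The same template extends to three-way interactions by taking triples of the screened variables in stage 2, which is what keeps the TSSRHR search space tractable even when the full interaction dictionary is of ultra-high dimension.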

Two-Stage Square Root Hard Ridge Method
Algorithm
Prediction and Estimation
Real World Data
Findings
Conclusion
