Abstract

In this paper, we consider the estimation of the parameters of the non-orthogonal regression model when a sparsity condition is suspected. We provide a comparative performance analysis of the primary penalty estimators, namely the ridge and the LASSO, against the least squares estimator (LSE), the restricted LSE, the preliminary test estimator, and Stein-type estimators, when the dimension of the parameter space is less than the dimension of the sample space. Using the principle of marginal distribution theory, the analysis of risks leads to the following conclusions: (i) the ridge estimator outperforms the least squares, preliminary test, and Stein-type estimators uniformly; (ii) the restricted least squares estimator and the LASSO are competitive, although the LASSO lags behind the restricted least squares estimator uniformly, and each outperforms the least squares, preliminary test, and Stein-type estimators over a subspace; (iii) the lower-bound risk expression of the LASSO does not depend on the threshold parameter; (iv) the performance of the estimators depends on the numbers of active and non-active coefficients and on the divergence parameter. In support of these conclusions, we provide tables and graphs illustrating the properties of the estimators.
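For orientation, the two penalty estimators under study take the standard penalized least squares forms below; the notation here is generic, and the penalty parameterization may differ from the paper's.

$$
\hat{\beta}_{\mathrm{ridge}} = \arg\min_{\beta}\left\{\|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2\right\},
\qquad
\hat{\beta}_{\mathrm{LASSO}} = \arg\min_{\beta}\left\{\|y - X\beta\|_2^2 + \lambda\|\beta\|_1\right\},
$$

with estimators compared through the weighted L2-risk
$R(\hat{\beta}; W) = \mathrm{E}\!\left[(\hat{\beta}-\beta)^{\top} W (\hat{\beta}-\beta)\right]$.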

Highlights

  • We use least squares estimators (LSEs) for a linear model, as they provide minimum variance unbiased estimation

  • The primary penalty estimators, the ridge and the LASSO, are compared to the LSE, restricted LSE (RLSE), preliminary test estimator (PTE), and Stein-type shrinkage estimators, namely the James–Stein estimator (JSE) and the positive-rule Stein estimator (PRSE)

  • We reach the following conclusions: (i) the ridge estimator outperforms the LSE, preliminary test estimator (PTE), and Stein-type estimators; (ii) the restricted estimator and the least absolute shrinkage and selection operator (LASSO) are competitive, although the LASSO lags behind the RLSE uniformly



Introduction

We use least squares estimators (LSEs) for a linear model because they provide minimum variance unbiased estimation. This paper is devoted to a comparative study of the finite sample performance of the primary penalty estimators, namely the LASSO and the ridge regression estimator. They are compared to the LSE, the restricted LSE (RLSE), the preliminary test estimator (PTE), and the Stein-type shrinkage estimators, namely the James–Stein estimator (JSE) and the positive-rule Stein estimator (PRSE). Conclusions are obtained based on the lower bound of the L2-risk expression for the LASSO estimator provided by Donoho and Johnstone [4]. The comparison of these estimators is based on mathematical analysis supported by tables of relative weighted L2-risk efficiency (RWRE) and graphs. In his pioneering paper, Tibshirani [12] examined the relative efficiency of subset selection, ridge regression, and the LASSO in three different scenarios under orthogonality of the design matrix.
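As a rough illustration of the kind of comparison the paper formalizes, the following Python sketch estimates the empirical L2-risk E‖β̂ − β‖² of the LSE, ridge, and LASSO by Monte Carlo on a sparse, non-orthogonal design, and reports each estimator's efficiency relative to the LSE in the spirit of the RWRE. The design, sparsity pattern, and penalty levels (`alpha`) are illustrative assumptions, not the paper's configuration.

```python
# Minimal Monte Carlo sketch (illustrative setup, not the paper's):
# compare empirical L2-risk of LSE, ridge, and LASSO on a sparse model.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
n, p, n_rep = 100, 20, 500
beta = np.zeros(p)
beta[:5] = 2.0                        # 5 active, 15 non-active coefficients
Sigma = 0.5 * np.eye(p) + 0.5         # equicorrelated (non-orthogonal) design
L = np.linalg.cholesky(Sigma)

losses = {"LSE": 0.0, "Ridge": 0.0, "LASSO": 0.0}
for _ in range(n_rep):
    X = rng.standard_normal((n, p)) @ L.T
    y = X @ beta + rng.standard_normal(n)
    fits = {
        "LSE": np.linalg.lstsq(X, y, rcond=None)[0],
        "Ridge": Ridge(alpha=1.0, fit_intercept=False).fit(X, y).coef_,
        "LASSO": Lasso(alpha=0.1, fit_intercept=False).fit(X, y).coef_,
    }
    for name, b in fits.items():
        losses[name] += np.sum((b - beta) ** 2) / n_rep

for name, risk in losses.items():
    # risk efficiency relative to the LSE, in the spirit of the RWRE
    print(f"{name:6s} risk = {risk:.3f}   efficiency vs LSE = {losses['LSE'] / risk:.2f}")
```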

Linear Model and the Estimators
Penalty Estimators
Shrinkage Estimators
Bias and Weighted L2-risks of Estimators
Modified LASSO
Comparison of LSE with RLSE
Comparison of LSE with PTE
Comparison of LSE with JSE and PRSE
Comparison of LSE and RLSE with RRE
Comparison of Modified LASSO with LSE and RLSE
Application
Summary and Concluding Remarks
