Abstract

We discuss a model selection procedure, the adaptive ridge selector, derived from a hierarchical Bayes argument, which results in a simple and efficient fitting algorithm. The hierarchical model utilized resembles an un-replicated variance components model and leads to weighting of the covariates. We discuss the intuition behind this type of estimator and investigate its behavior as a regularized least squares procedure. While related alternatives have recently been exploited to simultaneously fit and select variables/features in regression models (Tipping in J Mach Learn Res 1:211–244, 2001; Figueiredo in IEEE Trans Pattern Anal Mach Intell 25:1150–1159, 2003), the extension presented here shows considerable improvement in model selection accuracy in several important cases. We also compare this estimator's model selection performance to that offered by the lasso and adaptive lasso solution paths. Under randomized experimentation, we show that a fixed choice of tuning parameter yields model selection accuracy superior to the entire solution paths of the lasso and adaptive lasso when the underlying model is sparse. We also provide a robust version of the algorithm, suitable for cases where outliers may be present.
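The abstract does not spell out the fitting algorithm, but one common way to realize a covariate-weighted ridge of this flavor is an iteratively reweighted ridge scheme, in which each coefficient's penalty weight is recomputed from its current estimate so that small coefficients are penalized ever more heavily and shrink toward zero. The sketch below illustrates that idea; it is a minimal Python illustration under these assumptions, not the authors' exact procedure, and the function name `adaptive_ridge` and tuning constants `lam` and `eps` are placeholders introduced here.

```python
import numpy as np

def adaptive_ridge(X, y, lam=1.0, eps=1e-6, n_iter=100, tol=1e-8):
    """Iteratively reweighted ridge sketch (illustrative, not the paper's exact algorithm).

    Each pass solves a weighted ridge problem
        beta = argmin ||y - X beta||^2 + lam * sum_j w_j * beta_j^2,
    then sets w_j = 1 / (beta_j^2 + eps), so covariates with small
    coefficients receive large penalties and are driven toward zero.
    """
    n, p = X.shape
    w = np.ones(p)          # initial covariate weights
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Weighted ridge solve: (X'X + lam * diag(w)) beta = X'y
        A = X.T @ X + lam * np.diag(w)
        beta_new = np.linalg.solve(A, X.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
        # Re-weight: near-zero coefficients get very large penalties,
        # which mimics variable selection within a ridge-type fit
        w = 1.0 / (beta ** 2 + eps)
    return beta

# Example on a sparse model: only the first two coefficients are nonzero
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ beta_true + rng.standard_normal(100)
print(np.round(adaptive_ridge(X, y, lam=1.0), 2))
```

In a sketch like this, coefficients whose weights blow up are effectively excluded from the fitted model, which is the selection-through-weighting behavior the abstract attributes to the hierarchical formulation.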
