Abstract

As a robust data analysis technique, quantile regression has attracted extensive interest. In this study, the weighted quantile regression (WQR) technique is developed based on the sparsity function. We first consider the linear regression model and show that the relative efficiency of WQR compared with least squares (LS) and composite quantile regression (CQR) is greater than 70% regardless of the error distribution. To make the proposed method practically more useful, we consider two nontrivial extensions. The first concerns a nonparametric model: the local WQR estimate is introduced to explore the nonlinear data structure and is shown to be much more efficient than competing estimates under various non-normal error distributions. The second extension concerns a multivariate problem where variable selection is needed along with estimation. We couple WQR with penalization and show that, under mild conditions, the penalized WQR enjoys the oracle property. WQR has an intuitive formulation and can be easily implemented. Simulations are conducted to examine its finite-sample performance and compare it against alternatives. Analysis of a mammal dataset is also conducted. The numerical studies are consistent with the theoretical findings and indicate the usefulness of WQR.
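To fix ideas, the estimation problem can be viewed as minimizing a weighted combination of check losses over several quantile levels, with a common slope vector and quantile-specific intercepts as in CQR. The sketch below (Python, NumPy/SciPy) is a minimal illustration of such an objective, not the paper's exact procedure: the quantile levels `taus` and the weights `weights` are user-supplied placeholders, whereas the paper derives data-dependent weights from the sparsity function, and a linear-programming solver would be used in practice rather than a general-purpose optimizer.

    import numpy as np
    from scipy.optimize import minimize

    def check_loss(u, tau):
        # Quantile check loss: rho_tau(u) = u * (tau - I(u < 0))
        return u * (tau - (u < 0))

    def wqr_fit(X, y, taus, weights):
        # Weighted quantile regression with a common slope vector and
        # quantile-specific intercepts (a CQR-type formulation).
        # NOTE: `weights` are user-supplied placeholders here; the paper's
        # WQR uses data-dependent weights based on the sparsity function.
        n, p = X.shape
        K = len(taus)

        def objective(theta):
            intercepts, beta = theta[:K], theta[K:]
            total = 0.0
            for k in range(K):
                resid = y - intercepts[k] - X @ beta
                total += weights[k] * check_loss(resid, taus[k]).sum()
            return total

        # Nelder-Mead is used only for illustration; the objective is
        # piecewise linear and convex, so linear programming is the
        # natural choice in a real implementation.
        res = minimize(objective, np.zeros(K + p), method="Nelder-Mead",
                       options={"maxiter": 50000, "maxfev": 50000})
        return res.x[:K], res.x[K:]   # quantile intercepts, slope estimate

    # Example usage with heavy-tailed (t with 3 df) errors and equal weights
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = X @ np.array([1.0, -0.5]) + rng.standard_t(df=3, size=200)
    b_hat, beta_hat = wqr_fit(X, y, taus=[0.25, 0.5, 0.75],
                              weights=[1 / 3, 1 / 3, 1 / 3])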

Highlights

  • In practical regression analysis, it is common that the collected response data display heterogeneity due to either heteroscedastic variance or heavy tails of random errors

  • We develop the weighted quantile regression (WQR) method to further improve over QR and composite quantile regression (CQR)

  • We propose weighted quantile regression and establish its favorable theoretical properties


Summary

Introduction

It is common that collected response data display heterogeneity due to either heteroscedastic variance or heavy tails of the random errors. To improve over standard quantile regression (QR), composite quantile regression (CQR; Zou and Yuan 2008) has been proposed. It combines strengths across multiple quantile regression models and has been shown to outperform standard QR. We develop the weighted quantile regression (WQR) method to further improve over QR and CQR. Like CQR, WQR combines strengths across multiple quantile regressions, but it does so more efficiently by using data-dependent weights at different quantiles. We first introduce the WQR method under the simple linear regression model and study its properties, in particular its relative efficiency. We then consider a nonparametric regression model, develop the local WQR method, and investigate its theoretical properties. Finally, we consider the scenario with multiple covariates, where variable selection is needed along with estimation, and adopt the penalized WQR selection method. Some technical details and additional numerical study results are presented in the Appendix.
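As a rough illustration of the nonparametric extension, the following sketch fits a local linear, kernel-weighted version of the same kind of objective at a single evaluation point x0 and, following the convention used for local CQR, averages the fitted quantile intercepts to estimate the regression function there. The Gaussian kernel, the bandwidth h, and the quantile weights are all placeholders under stated assumptions; the paper's local WQR has its own data-dependent weights and bandwidth choices.

    import numpy as np
    from scipy.optimize import minimize

    def local_wqr(x, y, x0, taus, weights, h):
        # Local linear weighted quantile regression at the point x0.
        # Kernel weights localize the fit; quantile weights combine the
        # quantile levels (both are illustrative placeholders here).
        kern = np.exp(-0.5 * ((x - x0) / h) ** 2)   # Gaussian kernel
        K = len(taus)

        def objective(theta):
            a, b = theta[:K], theta[K]   # intercept per quantile, common local slope
            total = 0.0
            for k in range(K):
                u = y - a[k] - b * (x - x0)
                total += weights[k] * np.sum(kern * u * (taus[k] - (u < 0)))
            return total

        res = minimize(objective, np.zeros(K + 1), method="Nelder-Mead",
                       options={"maxiter": 50000, "maxfev": 50000})
        a_hat, b_hat = res.x[:K], res.x[K]
        # Averaging the fitted intercepts (as in local CQR with equally
        # spaced quantile levels) estimates m(x0); b_hat estimates m'(x0).
        return np.mean(a_hat), b_hat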

Weighted Quantile Regression Technique and Properties
Asymptotic relative efficiency
Local WQR for the nonlinear regression model
Penalized WQR for the multivariate model
Simulation Study
Real Data Analysis
Findings
Discussion