Abstract

Ordinary least squares (OLS) regression models only the conditional mean of the response and is computationally inexpensive. Quantile regression is more computationally demanding but more rigorous: it can model a whole vector of quantiles and is robust to outliers. Unlike least squares regression, quantile regression assumes neither a particular parametric distribution nor a constant variance for the response. This paper examines the impact of the choice of quantiles (the tau vector) on the parameter estimates in models fitted by quantile regression. Two data sets with normally distributed random errors were simulated, one with non-constant variance (heteroscedastic) and one with constant variance (homoscedastic). For the heteroscedastic data, the intercept estimates change little across models, while the slope estimates increase steadily as the quantile increases. For the homoscedastic data, most of the slope estimates fall within the OLS confidence interval bounds; only a few quantile estimates lie outside the upper bound. The hypothesis that the quantile estimates are equivalent is rejected for the heteroscedastic data at the 5% level of significance, showing that OLS is not appropriate for such data, but it is not rejected for the homoscedastic data, indicating that quantile regression is unnecessary when the variance is constant. Based on three accuracy measures, the mean absolute percentage error (MAPE), the median absolute deviation (MAD), and the mean squared deviation (MSD), the best model for the heteroscedastic data is obtained at the first quantile level (tau = 0.10).
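To make the comparison concrete, the sketch below simulates a heteroscedastic data set and fits quantile regressions over a tau vector alongside OLS, so the drift of the slope estimates across quantiles can be inspected directly. This is a minimal illustration, not the paper's code: the sample size, coefficients, error scale, and use of the statsmodels library are assumptions chosen for demonstration only.

```python
# Illustrative sketch: quantile regression over a tau vector vs. OLS
# on simulated heteroscedastic data (all settings are assumed, not the paper's).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)
# Heteroscedastic normal errors: the spread grows with x.
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5 * x, n)

X = sm.add_constant(x)

# OLS fit and its 95% confidence interval for the slope.
ols = sm.OLS(y, X).fit()
print("OLS slope: %.3f, 95%% CI: %s" % (ols.params[1], ols.conf_int()[1]))

# Quantile regression at several tau levels; with heteroscedastic data
# the slope estimates are expected to increase with tau.
taus = [0.10, 0.25, 0.50, 0.75, 0.90]
for tau in taus:
    qr = sm.QuantReg(y, X).fit(q=tau)
    print("tau=%.2f  intercept=%.3f  slope=%.3f" % (tau, qr.params[0], qr.params[1]))
```

Repeating the same loop on data generated with a constant error variance would, as the abstract reports, leave most quantile slope estimates inside the OLS confidence bounds.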
