Abstract

We introduce a robust regression estimator for time series factor models called the mOpt estimator. This estimator minimizes the maximum bias caused by outlier-generating distributional deviations from a factor model with normally distributed errors, while at the same time retaining high efficiency at the normal distribution. The efficacy of this estimator is demonstrated in applications to single-factor and multifactor time series models. Extensive empirical investigation of the mOpt robust betas versus non-robust least squares betas shows that differences between the two estimates greater than 0.3 occur for about 18% of the stocks, and differences greater than 0.5 occur for about 7.5% of the stocks. We introduce and demonstrate the use of a robust statistical test for differences between mOpt and least squares factor model coefficients, as well as a new robust model selection method that makes natural use of the mOpt regression estimator. We highly recommend that practitioners and data service providers compute robust mOpt betas as a standard-practice complement to least squares betas.
