Abstract

We propose a solution to the measurement error problem that plagues the estimation of the relation between the expected return of the stock market and its conditional variance, a problem that arises because both conditional moments are latent. In a first step, we use intra-period returns to construct a nonparametric proxy for the latent conditional variance; in a second step, this proxy serves as an input to a GMM procedure that estimates the parameters characterizing the risk–return tradeoff. We propose a bias correction to the standard GMM estimator, derived under a double asymptotic framework in which the number of intra-period returns, N, and the number of low-frequency time periods, T, are simultaneously large. Simulation exercises show that the bias correction is particularly relevant for small values of N, which is the case in empirical scenarios involving long time periods. The methodology lends itself to additional applications, such as the empirical evaluation of factor models, in which the factor betas may be estimated using intra-period returns and the unexplained returns, or alphas, subsequently recovered at lower frequencies.
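
The two-step logic of the abstract can be illustrated with a short simulation. The sketch below is not the paper's estimator: the data-generating process, the just-identified moment conditions E[(r_t − a − g·RV_t)(1, RV_t)'] = 0, and the errors-in-variables-style attenuation correction (which uses a realized-quarticity estimate of the proxy's noise) are all illustrative assumptions, meant only to show why a naive second-step estimate is biased when N is small and how a correction of order 1/N can undo it.

```python
import numpy as np

rng = np.random.default_rng(0)

# ---- Toy data-generating process (an assumption, not the paper's design) ----
# The latent per-period variance sigma2_t follows a log-AR(1), and the period
# return satisfies the tradeoff E[r_t | sigma2_t] = alpha + gamma * sigma2_t.
T, N = 600, 22                 # T low-frequency periods, N intra-period returns each
alpha, gamma = 0.0, 3.0        # true tradeoff parameters
mu_h, phi, s = np.log(0.0088), 0.95, 0.25
h = np.empty(T)
h[0] = mu_h
for t in range(1, T):
    h[t] = mu_h + phi * (h[t - 1] - mu_h) + s * rng.standard_normal()
sigma2 = np.exp(h)             # latent conditional variance (per period)

# N intra-period returns per period, consistent with the period-level moments
intra = rng.normal((alpha + gamma * sigma2)[:, None] / N,
                   np.sqrt(sigma2 / N)[:, None], size=(T, N))

# ---- Step 1: realized variance as the nonparametric variance proxy ----
rv = np.sum(intra ** 2, axis=1)   # RV_t = sigma2_t + noise of order 1/sqrt(N)
r = np.sum(intra, axis=1)         # low-frequency period return

# ---- Step 2: just-identified GMM with instruments z_t = (1, RV_t) ----
# Moments E[(r_t - a - g * RV_t) z_t] = 0 reduce to OLS of r_t on (1, RV_t).
Z = np.column_stack([np.ones(T), rv])
a_hat, g_hat = np.linalg.lstsq(Z, r, rcond=None)[0]

# ---- Illustrative bias correction (not the paper's formula) ----
# RV_t = sigma2_t + e_t with Var(e_t) ~ 2 * IQ_t / N (IQ_t: integrated
# quarticity), so the naive slope is attenuated toward zero. A classical
# errors-in-variables correction with estimated quarticity:
iq_hat = (N / 3.0) * np.sum(intra ** 4, axis=1)   # realized quarticity
noise_var = np.mean(2.0 * iq_hat / N)             # avg. proxy-noise variance
g_corr = g_hat * np.var(rv) / (np.var(rv) - noise_var)

print(f"gamma: naive = {g_hat:.2f}, corrected = {g_corr:.2f}, true = {gamma}")
```

Under these settings the naive slope is attenuated by roughly Var(sigma2)/(Var(sigma2) + Var(e)), and the corrected estimate recovers the true gamma on average; shrinking N widens the gap between the two, mirroring the abstract's point that the bias correction matters most when N is small.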
