Abstract

We test a naive model for forecasting ex-ante Value-at-Risk (VaR) using a shrinkage estimator between the realized volatility estimated on past return time series and the implied volatility quoted on the market. Implied volatility is often regarded as the market operators' expectation of future risk, while historical volatility straightforwardly represents the realized risk prior to the estimation point and is by definition backward looking. The VaR prediction strategy therefore uses information on both the expected future risk and the past estimated risk. We examine the model presented in Cesarone and Colucci (2016), called Shrunk Volatility VaR, and validate it in the multivariate framework, in particular on US equities and bonds, empirically comparing its forecasting power with that of four benchmark VaR models. The performance of all VaR models is validated using both statistical accuracy and efficiency evaluation tests on 39 equally spaced balanced portfolios over an out-of-sample period that covers several crises. We evaluate model performance at four VaR confidence levels (95%, 99%, 99.5%, and 99.9%). We also validate the models under loss-function backtests, and our results confirm the efficacy of implied volatility indexes, combined with realized volatilities, as inputs for a VaR model. Furthermore, we confirm the conclusions of Cesarone and Colucci (2016) in almost all balanced portfolios. Finally, we relax the main assumption of the model (Normally distributed returns) in favor of a fat-tailed distribution (Student's t) and find that, switching between Normality and Student's t, the Shrunk Volatility VaR could serve as a tool for portfolio managers to quickly monitor investment decisions before employing more sophisticated risk management systems.
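The abstract summarizes the mechanics without formulas, so the sketch below illustrates one plausible reading: the shrunk volatility is a convex combination of realized and implied volatility, and the VaR is the corresponding quantile of the assumed (Normal or Student's t) zero-mean return distribution scaled by that volatility. The weight `lam`, the default degrees of freedom `df`, and the function name are illustrative assumptions, not the specification of Cesarone and Colucci (2016).

```python
import numpy as np
from scipy import stats

def shrunk_volatility_var(realized_vol, implied_vol, lam,
                          alpha=0.99, dist="normal", df=5):
    """Minimal sketch of a one-step-ahead Shrunk Volatility VaR forecast.

    realized_vol : backward-looking volatility from past returns
    implied_vol  : forward-looking volatility quoted on the market
    lam          : hypothetical shrinkage weight on implied vol, in [0, 1]
    alpha        : VaR confidence level (e.g. 0.95, 0.99, 0.995, 0.999)
    dist         : 'normal' or 't' (the paper's fat-tailed alternative)
    df           : degrees of freedom for the Student's t case (df > 2)
    """
    # Shrunk volatility: convex combination of the two risk estimates.
    sigma = lam * implied_vol + (1.0 - lam) * realized_vol
    if dist == "normal":
        q = stats.norm.ppf(1.0 - alpha)
    else:
        # Rescale the t quantile so the distribution has unit variance.
        q = stats.t.ppf(1.0 - alpha, df) * np.sqrt((df - 2.0) / df)
    return -q * sigma  # reported as a positive loss threshold

# Example: daily realized vol of 1.2%, implied vol of 1.8%, equal weights.
print(shrunk_volatility_var(0.012, 0.018, lam=0.5, alpha=0.99))
```

Switching `dist` between `"normal"` and `"t"` mirrors the distributional relaxation tested in the paper; only the quantile changes, so the volatility shrinkage step is shared by both variants.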
