Abstract

This paper explores three models for estimating volatility: the exponentially weighted moving average (EWMA), generalized autoregressive conditional heteroskedasticity (GARCH) and stochastic volatility (SV). The volatility estimated by these models can be used to measure the market risk of a portfolio of assets, known as Value at Risk (VaR). VaR depends on the volatility, the time horizon and the confidence level adopted for the continuous returns under analysis. For the empirical assessment of these models, we used a sample of Petrobras stock prices to specify the GARCH and SV models. We then evaluated the fitted models through violation backtesting of the one-day VaR in order to compare the efficiency of the SV, GARCH and EWMA (suggested by RiskMetrics) volatility models. The results suggest that the VaR calculated with EWMA was violated less often than the VaR based on SV and GARCH for a 1,500-observation window. Hence, for our sample, the model suggested by RiskMetrics (1999), which uses exponential smoothing and is easier to implement, did not produce inferior violation test results when compared to the more sophisticated SV and GARCH models.
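
To make the calculation described in the abstract concrete, the Python sketch below illustrates a RiskMetrics-style EWMA variance recursion with decay factor λ = 0.94 and a parametric one-day VaR under the normality assumption. This is only a minimal illustration: the function names, the variance seed and the simulated returns are assumptions of this sketch, not the paper's data or code.

    import numpy as np
    from scipy.stats import norm

    def ewma_volatility(returns, lam=0.94):
        # RiskMetrics-style recursion: sigma2_t = lam*sigma2_{t-1} + (1-lam)*r_{t-1}^2
        sigma2 = np.var(returns[:30])         # seed with a short sample variance (a choice of this sketch)
        for r in returns[30:]:
            sigma2 = lam * sigma2 + (1.0 - lam) * r ** 2
        return np.sqrt(sigma2)                # one-step-ahead volatility forecast

    def one_day_var(position_value, sigma, confidence=0.99):
        # Parametric one-day VaR assuming normally distributed continuous returns
        z = norm.ppf(confidence)              # about 2.33 at the 99% level
        return position_value * z * sigma

    # Illustrative usage with simulated returns, not the paper's Petrobras sample
    rng = np.random.default_rng(0)
    simulated = rng.normal(0.0, 0.02, 1500)   # 1,500-observation window, as in the paper
    print(one_day_var(1_000_000.0, ewma_volatility(simulated)))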

Highlights

  • The main objective of volatility models is to provide a measure that can be used in managing financial risks, helping in the selection of portfolio assets and in derivatives pricing

  • The present paper suggests the use of autoregressive conditional heteroskedasticity and stochastic volatility models to predict the volatility used in Value at Risk (VaR) measures

  • We use a violation test to compare the VaR limits of the models obtained by generalized autoregressive conditional heteroskedasticity (GARCH), stochastic volatility (SV) and the model suggested by RiskMetrics (1999) for the marked-to-market returns of our portfolio of Petrobras shares (a sketch of such a violation count follows this list)
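
The summary does not specify which violation test statistic the paper applies, so the Python sketch below uses one common formalisation, Kupiec's proportion-of-failures test: count the days on which the realised loss exceeds the one-day VaR limit and compare the observed violation rate with the expected rate. The function name and the chi-square approximation are assumptions of this illustration, not necessarily the paper's procedure.

    import numpy as np
    from scipy.stats import chi2

    def violation_backtest(returns, var_limits, confidence=0.99):
        # A violation occurs when the realised loss exceeds that day's VaR forecast
        losses = -np.asarray(returns)
        hits = losses > np.asarray(var_limits)
        n, x = hits.size, int(hits.sum())
        p = 1.0 - confidence                  # expected violation rate under the null
        p_hat = x / n                         # observed violation rate
        # Kupiec proportion-of-failures likelihood ratio, approx. chi-square with 1 d.o.f.
        lr = -2.0 * (np.log((1 - p) ** (n - x) * p ** x)
                     - np.log((1 - p_hat) ** (n - x) * p_hat ** x))
        p_value = 1.0 - chi2.cdf(lr, df=1)
        return x, p_value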


Summary

INTRODUCTION

The main objective of volatility models is to provide a measure that can be used in managing financial risks, helping in the selection of portfolio assets and in derivatives pricing. The first approach, based on the model proposed by RiskMetrics and the most common method among VaR users, uses exponential smoothing with a decay factor λ of 0.94 and assumes that returns are normally distributed. This approach can be considered a particular case of the generalized autoregressive conditional heteroskedasticity (GARCH) model. According to Jorion, researchers compared the models employed by banks with the VaR calculated from an ARMA(1,1) plus GARCH(1,1) model, assuming a normal distribution, and found by backtesting that the banks' VaR, being more conservative, did not follow the profit and loss (P&L) volatility of their portfolios and was outperformed by the GARCH model in terms of violations of the VaR limits.
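
To make the "particular case" remark concrete, the Python sketch below writes out the GARCH(1,1) conditional variance recursion and shows how the RiskMetrics EWMA filter arises when ω = 0, α = 1 − λ and β = λ. The parameter values and simulated returns are illustrative only; the paper estimates its GARCH parameters from the Petrobras sample, which is not reproduced in this summary.

    import numpy as np

    def garch11_variance(returns, omega, alpha, beta, sigma2_0=None):
        # Conditional variance recursion: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}
        r = np.asarray(returns, dtype=float)
        sigma2 = np.empty_like(r)
        sigma2[0] = np.var(r) if sigma2_0 is None else sigma2_0
        for t in range(1, r.size):
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        return sigma2

    # The RiskMetrics EWMA filter is recovered by setting omega = 0, alpha = 1 - lambda, beta = lambda
    lam = 0.94
    rng = np.random.default_rng(1)
    r = rng.normal(0.0, 0.02, 1500)           # simulated returns; the paper uses Petrobras prices
    ewma_path = garch11_variance(r, omega=0.0, alpha=1.0 - lam, beta=lam)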

GARCH MODEL
DESCRIPTION OF THE DATA
BACKTESTING
FINAL CONSIDERATIONS