Abstract

Despite the popularity of model calibration in finance, empirical researchers have placed more emphasis on model estimation than on the equally important goodness-of-fit problem. This is due partly to the ignorance of modelers, and more to the inability of existing statistical tests to detect specification errors. In practice, models are often calibrated by minimizing the sum of squared differences between the modelled and actual observations. Under this approach it is challenging to disentangle model error from estimation error in the residual series. To circumvent this difficulty, we study an alternative way of estimating the model by exact calibration. We argue that standard time series tests based on the exact approach can reveal model misspecification better than the error-minimizing approach. In the context of option pricing, we illustrate the usefulness of exact calibration in detecting model misspecification. Under a heteroskedastic observation error structure, our simulation results show that the Black-Scholes model calibrated by the exact approach delivers more accurate hedging performance than the model calibrated by error minimization.
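
The two calibration styles contrasted in the abstract can be illustrated with a minimal Black-Scholes sketch. The code below is not the paper's implementation; the spot price, interest rate, strikes, and observed option prices are purely illustrative assumptions. Error-minimizing calibration fits a single volatility by least squares across all options, while exact calibration solves for the volatility that reproduces each observed price exactly (the implied volatility).

```python
# Minimal sketch (illustrative only): error-minimizing vs. exact calibration
# of the Black-Scholes model. Market data below are assumed, not from the paper.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq, minimize_scalar

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Illustrative (assumed) market data.
S, r, T = 100.0, 0.02, 0.5
strikes = np.array([90.0, 100.0, 110.0])
observed = np.array([12.10, 5.30, 1.90])   # observed call prices

# 1) Error-minimizing calibration: one volatility minimizing the sum of
#    squared differences between model and observed prices.
def sse(sigma):
    return np.sum((bs_call(S, strikes, T, r, sigma) - observed) ** 2)

sigma_ls = minimize_scalar(sse, bounds=(1e-4, 2.0), method="bounded").x

# 2) Exact calibration: for each option, solve for the volatility that
#    matches the observed price exactly (root-finding for implied vol).
sigma_exact = np.array([
    brentq(lambda s: bs_call(S, K, T, r, s) - p, 1e-6, 3.0)
    for K, p in zip(strikes, observed)
])

print("least-squares sigma:", round(float(sigma_ls), 4))
print("exact (implied) sigmas:", np.round(sigma_exact, 4))
```

In the least-squares case, model error and estimation error are mixed together in the pricing residuals; in the exact case the model is forced to fit the observations, so any misspecification shows up instead in the time-series behaviour of the calibrated parameters, which is the property the abstract's argument relies on.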
