Abstract
A new method for computing the standard errors of returns-based risk and performance estimators for serially correlated returns is developed. The method uses the fact that any such estimator can be represented as the sum of returns transformed by the estimator's influence function, and the fact that the variance of such a sum can be estimated by estimating the zero-frequency value of the spectral density of the influence-function-transformed returns. The spectral density is estimated by fitting a polynomial to the periodogram using a generalized linear model for exponential distributions, with elastic net regularization. Adaptive prewhitening is used to obtain good performance for both large and small serial correlation of returns. We show that the method performs much better than conventional standard error computations for a collection of 13 hedge funds whose returns have varying degrees of serial correlation. Extensive Monte Carlo mean-squared error performance studies for a number of commonly used risk and performance estimators, using first-order autoregressive returns, show that the new method delivers good performance for serial correlations ranging from 0.0 to 0.9, over which range it outperforms alternative Newey-West methods. Simulations also show that the new method works well for GARCH(1,1) returns processes.
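The core pipeline in the abstract (influence-function transform, periodogram, exponential-likelihood polynomial fit, zero-frequency standard error) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses the mean estimator (whose influence function is simply r − μ̂), fits the log-spectrum by an unpenalized Whittle/exponential likelihood, and omits the elastic net regularization and adaptive prewhitening steps; all parameter choices (AR(1) simulation, polynomial degree 3) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)

# Simulate AR(1) returns (illustrative data; phi and n are arbitrary choices).
n, phi = 1024, 0.5
eps = rng.standard_normal(n)
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi * r[t - 1] + eps[t]

# Influence-function transform for the mean estimator: IF(r) = r - mu_hat.
z = r - r.mean()

# Periodogram at Fourier frequencies lambda_j = 2*pi*j/n, j = 1..n//2.
J = np.fft.rfft(z)[1:]                      # drop the zero-frequency term
I = (np.abs(J) ** 2) / (2 * np.pi * n)      # periodogram ordinates
lam = 2 * np.pi * np.arange(1, len(I) + 1) / n

# Fit log f(lambda) as a cubic polynomial by exponential (Whittle) likelihood:
# I_j ~ Exp(mean = f(lambda_j)), so the negative log-likelihood is
# sum(log f_j + I_j / f_j).  (The paper adds elastic net regularization here.)
deg = 3
X = np.vander(lam, deg + 1, increasing=True)
beta0 = np.linalg.lstsq(X, np.log(I + 1e-12), rcond=None)[0]  # rough start

def nll(beta):
    logf = X @ beta
    return np.sum(logf + I * np.exp(-logf))

beta = minimize(nll, beta0, method="Nelder-Mead",
                options={"maxiter": 5000}).x

# Zero-frequency spectral density gives the long-run variance of the estimator.
f0 = np.exp(beta[0])                        # f(0) = polynomial evaluated at 0
se_serial = np.sqrt(2 * np.pi * f0 / n)     # serial-correlation-aware SE
se_iid = z.std(ddof=1) / np.sqrt(n)         # naive i.i.d. SE for comparison
print(se_serial, se_iid)
```

For positively autocorrelated returns the spectral-density-based standard error exceeds the naive i.i.d. one, which is exactly the correction the method is designed to deliver; other estimators plug in their own influence functions in place of r − μ̂.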