Abstract

Volatility analysis plays a major role in finance and economics: it is a key input to risk management and to option and derivative pricing. One pressing computational hurdle in high-frequency financial statistics is the tremendous amount of data, together with optimization procedures that demand computing power beyond currently available desktop systems. In this article, we focus on statistical inference for large volatility matrices based on high-frequency financial data and propose a regularization approach to achieve lower prediction errors. We also apply a hybrid parallelization solution to carry out efficient computations for high-dimensional statistical methods on intra-day high-frequency data. A variety of hardware- and software-based HPC techniques, including parallel R, the Intel Math Kernel Library, and automatic offloading to the Intel Xeon Phi coprocessor, are used to speed up the statistical computations. Our numerical studies are based on high-frequency price data for stocks traded on the New York Stock Exchange in 2013. The results show that the estimator constructed with the regularization approach generally achieves higher prediction power while enjoying a faster convergence rate. We demonstrate significant performance improvements in statistical inference for high-frequency financial data by combining software and hardware parallelism.
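As a rough illustration of the software-parallelism side mentioned above (parallel R), the sketch below computes per-stock realized volatility from intraday log prices across multiple cores. The data layout (a named list of price vectors), the function names, and the use of mclapply() are assumptions made for illustration only; they do not reproduce the paper's regularized volatility-matrix estimator. MKL acceleration and Xeon Phi offloading operate at the BLAS/compiler level and require no change to R code like this.

## Minimal sketch, assuming intraday log prices are available as a
## named list of numeric vectors, one per stock (hypothetical layout).
library(parallel)

## Realized variance of one stock: sum of squared intraday log returns.
realized_var <- function(log_price) {
  r <- diff(log_price)
  sum(r^2)
}

## Parallel per-stock realized volatility across available cores.
estimate_daily_vol <- function(prices, cores = detectCores()) {
  rv <- mclapply(prices, realized_var, mc.cores = cores)
  sqrt(unlist(rv))
}

On platforms without fork-based parallelism (e.g., Windows), parLapply() over a PSOCK cluster would replace mclapply() in the same role.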
