Abstract
In this paper, we consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts), a class of generalized mixtures of nonlinear (non)parametric AR-ARCH time series. The main objective of this paper is to learn the autoregressive and volatility functions of this model with neural networks (NN). This approach is justified by the universal approximation capacity of neural networks. On the other hand, in order to build the learning theory, it is first necessary to prove the ergodicity of the CHARME model. We therefore show, in a general nonparametric framework, that under certain Lipschitz-type conditions on the autoregressive and volatility functions, this model is stationary, ergodic and $\tau$-weakly dependent. These conditions are much weaker than those in the existing literature. Moreover, this result forms the theoretical basis for deriving an asymptotic theory of the underlying parametric estimation, which we present for this model in a general parametric framework. Altogether, this allows us to develop a learning theory for the NN-based autoregressive and volatility functions of the CHARME model, where strong consistency and asymptotic normality of the considered estimator of the NN weights and biases are guaranteed under weak conditions. Numerical experiments are reported to support our theoretical findings.
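To fix ideas, the following is a minimal simulation sketch of a CHARME-type process with K regimes, each having its own autoregressive (mean) function and volatility function, selected at every time step by a hidden switching variable. The regime probabilities, the specific functions f_k and g_k, and the i.i.d. switching mechanism below are illustrative assumptions for exposition, not the paper's exact specification.

```python
# Hedged sketch: simulate a two-regime CHARME-type AR-ARCH process.
# All numerical choices (K, p, pi, f_k, g_k) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

K, T = 2, 1000                              # number of regimes, sample length

# Illustrative regime-specific mean and volatility functions (Lipschitz-type).
f = [lambda x: 0.5 * np.tanh(x), lambda x: -0.3 * x]
g = [lambda x: 0.2 + 0.1 * np.abs(x), lambda x: 0.4 + 0.05 * np.abs(x)]
pi = np.array([0.6, 0.4])                   # assumed i.i.d. regime probabilities

X = np.zeros(T)
for t in range(1, T):
    k = rng.choice(K, p=pi)                 # hidden regime at time t
    eps = rng.standard_normal()             # innovation
    X[t] = f[k](X[t - 1]) + g[k](X[t - 1]) * eps
```

In the NN-based approach discussed in the paper, the unknown functions playing the role of f_k and g_k would be replaced by neural networks whose weights and biases are estimated from the observed series.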