Abstract
This article improves the non-Gaussian simulation method based on Hermite polynomial expansions presented in a previous article. The method aims to simulate sample paths of a strictly stationary non-Gaussian process given the first N moments of its one-dimensional marginal distribution and its autocorrelation function. The new model uses the maximum entropy principle to determine the marginal distribution, which makes it possible to obtain new convergence results. The convergence properties of this model are then examined, and the method is illustrated with several examples.
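The maximum-entropy step described above can be sketched numerically: among all densities matching a given set of moments, the maximum-entropy density has the exponential-polynomial form p(x) ∝ exp(−Σ λ_k x^k), and the multipliers λ_k can be found by matching the prescribed moments. The sketch below is illustrative only and is not the paper's algorithm; the target moments, truncation interval, and least-squares optimizer are assumptions introduced here for demonstration.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize

# Hypothetical setup: match the first N = 2 moments (mean 0, second moment 1)
# on a truncated support [-5, 5]; these choices are illustrative assumptions.
x = np.linspace(-5.0, 5.0, 2001)
target_moments = [0.0, 1.0]  # m_1, m_2 of the desired marginal

def density(lam):
    """Maximum-entropy density p(x) ∝ exp(-(lam[0] x + lam[1] x^2 + ...))."""
    logp = -sum(l * x ** (k + 1) for k, l in enumerate(lam))
    logp -= logp.max()            # guard against overflow in exp
    p = np.exp(logp)
    return p / trapezoid(p, x)    # normalize so p integrates to 1

def moment_error(lam):
    """Sum of squared deviations of the density's moments from the targets."""
    p = density(lam)
    moms = [trapezoid(x ** (k + 1) * p, x) for k in range(len(lam))]
    return sum((m - t) ** 2 for m, t in zip(moms, target_moments))

# Solve for the Lagrange multipliers by minimizing the moment mismatch.
res = minimize(moment_error, x0=np.zeros(len(target_moments)),
               method="Nelder-Mead", options={"maxiter": 2000})
p_hat = density(res.x)  # with these targets, close to a standard normal
```

With only the first two moments prescribed, the recovered density is close to the standard Gaussian (λ ≈ (0, 1/2)), as the maximum-entropy principle predicts; prescribing higher moments would yield non-Gaussian marginals in the same way.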