Abstract

Estimation of dynamic mixture distributions is a difficult task because the density contains an intractable normalizing constant. To overcome this difficulty, we develop an approach that maximizes, by means of the cross-entropy method, a Monte Carlo approximation of the log-likelihood function. The proposed noisy cross-entropy approach is unsupervised, since it does not require the specification of a threshold between the distributions. Moreover, it bypasses the evaluation of the normalizing constant, combining good statistical properties with a modest computational burden. Both simulation-based evidence and empirical applications suggest that noisy cross-entropy estimation is comparable or preferable to existing methods in terms of statistical efficiency, while being less demanding computationally.
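As a rough illustration of the idea, the sketch below maximizes a Monte Carlo approximation of a dynamic-mixture log-likelihood with a cross-entropy search. The particular specification (a Weibull body, a generalized Pareto tail, and a Cauchy-cdf weight function), the Monte Carlo estimator of the normalizing constant, and all tuning constants are illustrative assumptions, not the paper's exact model or algorithm.

```python
# Minimal sketch of noisy cross-entropy (CE) estimation for a dynamic mixture.
# All model choices and tuning constants here are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def weight(x, mu, tau):
    # Dynamic weight p(x): Cauchy cdf with location mu and scale tau.
    return stats.cauchy.cdf(x, loc=mu, scale=tau)

def log_density_unnorm(x, theta):
    # Unnormalized dynamic-mixture density:
    # (1 - p(x)) * Weibull(x; k, lam) + p(x) * GPD(x; xi, sigma)
    mu, tau, k, lam, xi, sigma = theta
    p = weight(x, mu, tau)
    f1 = stats.weibull_min.pdf(x, c=k, scale=lam)
    f2 = stats.genpareto.pdf(x, c=xi, scale=sigma)
    return np.log((1.0 - p) * f1 + p * f2 + 1e-300)

def mc_log_constant(theta, m=1000):
    # Monte Carlo estimate of the normalizing constant,
    # Z = E_f1[1 - p(X)] + E_f2[p(X)], which avoids numerical quadrature.
    mu, tau, k, lam, xi, sigma = theta
    x1 = stats.weibull_min.rvs(c=k, scale=lam, size=m, random_state=rng)
    x2 = stats.genpareto.rvs(c=xi, scale=sigma, size=m, random_state=rng)
    z = np.mean(1.0 - weight(x1, mu, tau)) + np.mean(weight(x2, mu, tau))
    return np.log(z)

def noisy_loglik(data, theta):
    # Noisy objective: exact unnormalized part plus a Monte Carlo log-constant.
    return np.sum(log_density_unnorm(data, theta)) - data.size * mc_log_constant(theta)

def transform(raw):
    # Unconstrained raw parameters -> (mu, tau>0, k>0, lam>0, xi, sigma>0).
    mu, ltau, lk, llam, xi, lsig = raw
    return np.array([mu, np.exp(ltau), np.exp(lk), np.exp(llam), xi, np.exp(lsig)])

def ce_estimate(data, n_iter=30, n_samples=150, elite_frac=0.1, smooth=0.7):
    # Cross-entropy search: sample candidate parameter vectors from a Gaussian,
    # keep the elite fraction under the noisy log-likelihood, refit the Gaussian.
    mean, std = np.zeros(6), np.ones(6)
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iter):
        raw = rng.normal(mean, std, size=(n_samples, 6))
        scores = np.array([noisy_loglik(data, transform(r)) for r in raw])
        elite = raw[np.argsort(scores)[-n_elite:]]
        mean = smooth * elite.mean(axis=0) + (1 - smooth) * mean
        std = smooth * elite.std(axis=0) + (1 - smooth) * std
        if np.all(std < 1e-3):  # sampling distribution has collapsed: stop
            break
    return transform(mean)

# Illustrative run on synthetic data (not the paper's experiments).
data = np.concatenate([
    stats.weibull_min.rvs(c=1.5, scale=1.0, size=800, random_state=rng),
    stats.genpareto.rvs(c=0.3, scale=2.0, size=200, random_state=rng),
])
print(ce_estimate(data))
```

In this sketch the normalizing constant is only ever approximated by simulation from the two component densities, and the cross-entropy search merely ranks candidate parameter vectors by the resulting noisy log-likelihood, which is the sense in which an exact evaluation of the constant can be bypassed.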
