Abstract

The Expectation Maximization (EM) algorithm is a versatile tool for model parameter estimation in latent data models. When processing large data sets or data streams, however, EM becomes intractable since it requires the whole data set to be available at each iteration of the algorithm. In this contribution, a new generic online EM algorithm for model parameter inference in general hidden Markov models is proposed. This new algorithm updates the parameter estimate after each block of observations is processed (online). The convergence of this new algorithm is established, and the rate of convergence is studied, showing the impact of the block-size sequence. An averaging procedure is also proposed to improve the rate of convergence. Finally, practical illustrations are presented to highlight the performance of these algorithms in comparison to other online maximum likelihood procedures.
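
To give a concrete picture of the averaging step mentioned in the abstract, the sketch below implements plain iterate averaging in the Polyak-Ruppert style, a standard device in stochastic approximation. This is a minimal sketch under our own assumptions (the function name `polyak_average` and the burn-in parameter are ours), not the paper's exact procedure.

```python
import numpy as np

def polyak_average(iterates, burn_in=0):
    """Running average of parameter iterates after an optional burn-in.
    `iterates` has shape (n_iter, dim); row k is the k-th estimate."""
    tail = np.asarray(iterates)[burn_in:]
    counts = np.arange(1, len(tail) + 1)[:, None]
    return np.cumsum(tail, axis=0) / counts

# Toy usage: noisy iterates fluctuating around a target value of 1.0.
rng = np.random.default_rng(0)
thetas = 1.0 + 0.1 * rng.standard_normal((1000, 2))
print(polyak_average(thetas, burn_in=200)[-1])  # averaged final estimate
```

The averaged sequence inherits the limit of the raw iterates while smoothing out the step-size noise, which is what typically improves the rate of convergence.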

Highlights

  • A hidden Markov model (HMM) is a stochastic process {Xk, Yk}k≥0 in X × Y, where the state sequence {Xk}k≥0 is a Markov chain and where the observations {Yk}k≥0 are independent conditionally on {Xk}k≥0.

  • Each iteration is decomposed into two steps: the E-step computes the conditional expectation of the complete data log-likelihood given the observations, and the M-step updates the parameter estimate based on this conditional expectation.

  • When the complete data likelihood belongs to the curved exponential family, the E-step is replaced by a stochastic approximation step while the M-step remains unchanged (see the sketch after this list).

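To make the highlighted recursion concrete, here is a minimal sketch of the stochastic-approximation E-step and explicit M-step for an i.i.d. two-component Gaussian mixture with known weights and unit variances, estimating the component means. The model, step-size sequence, and initial values are illustrative assumptions; they mirror the i.i.d. online EM setting recalled in the introduction, not the HMM algorithm studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate i.i.d. draws from a two-component Gaussian mixture (unit variances).
true_means = np.array([-2.0, 2.0])
weights = np.array([0.5, 0.5])
n = 20_000
z = rng.choice(2, size=n, p=weights)
y = true_means[z] + rng.standard_normal(n)

# Online EM for the component means. Sufficient statistics per component j:
# s1[j] estimates E[1{X=j}] and s2[j] estimates E[1{X=j} Y].
mu = np.array([-0.5, 0.5])      # initial guess
s1 = np.full(2, 0.5)
s2 = 0.5 * mu

for k, yk in enumerate(y, start=1):
    gamma = k ** -0.6           # decreasing step sizes, exponent in (1/2, 1]
    # Stochastic-approximation E-step: posterior responsibilities of y_k.
    logp = np.log(weights) - 0.5 * (yk - mu) ** 2
    post = np.exp(logp - logp.max())
    post /= post.sum()
    s1 += gamma * (post - s1)
    s2 += gamma * (post * yk - s2)
    mu = s2 / s1                # M-step: explicit maximizer given the statistics

print(mu)  # approaches (-2, 2)
```

The running statistic is moved a fraction γ_k toward the conditional expectation computed from the new observation, and the M-step is available in closed form because the complete data likelihood is of exponential-family form.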

Summary

Introduction

A hidden Markov model (HMM) is a stochastic process {Xk, Yk}k≥0 in X × Y, where the state sequence {Xk}k≥0 is a Markov chain and where the observations {Yk}k≥0 are independent conditionally on {Xk}k≥0. When the complete data likelihood belongs to the curved exponential family, the E-step can be replaced by a stochastic approximation step while the M-step remains unchanged. The convergence of this online variant of the EM algorithm for i.i.d. observations is addressed in [4]: the limit points are the stationary points of the Kullback-Leibler divergence between the marginal distribution of the observations and the model distribution. No convergence results exist for these online EM algorithms in general state-space models (some insights on the asymptotic behavior are given in [3]): the introduction of several approximations at different steps of the algorithms makes the analysis quite challenging. In this contribution, a new online EM algorithm is proposed for HMMs whose complete data likelihood belongs to the curved exponential family. All proofs are postponed to Section 6; supplementary proofs and comments are provided in [20].
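
The following sketch illustrates the block-wise structure described above on a toy finite state-space HMM with a known transition matrix and unit-variance Gaussian emissions, estimating only the state-dependent means. Everything here (the model, the constant block size, the step sizes, the helper `block_suff_stats`) is an assumed toy setup for illustration; in particular, the paper analyzes general block-size sequences, while a constant block size is used below for readability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 2-state HMM: known transition matrix, unit-variance Gaussian emissions;
# only the state-dependent means are estimated.
A = np.array([[0.9, 0.1], [0.2, 0.8]])
true_means = np.array([-1.0, 1.5])

def simulate(T):
    x = np.zeros(T, dtype=int)
    for t in range(1, T):
        x[t] = rng.choice(2, p=A[x[t - 1]])
    return true_means[x] + rng.standard_normal(T)

def block_suff_stats(y, mu):
    """Forward-backward smoothing over one block; returns the per-observation
    expected sufficient statistics (state occupancies, weighted sums)."""
    T = len(y)
    b = np.exp(-0.5 * (y[:, None] - mu[None, :]) ** 2)  # emission densities
    alpha = np.zeros((T, 2))
    beta = np.ones((T, 2))
    alpha[0] = 0.5 * b[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                 # normalized forward pass
        alpha[t] = (alpha[t - 1] @ A) * b[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):        # normalized backward pass
        beta[t] = A @ (b[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    post /= post.sum(axis=1, keepdims=True)   # smoothing marginals
    return post.mean(axis=0), (post * y[:, None]).mean(axis=0)

# Block online EM skeleton: one parameter update per block of observations.
y_stream = simulate(50_000)
mu = np.array([0.0, 0.5])       # initial guess for the means
s1, s2 = np.full(2, 0.5), 0.5 * mu
block_size, start, k = 500, 0, 0
while start + block_size <= len(y_stream):
    k += 1
    gamma = k ** -0.6           # decreasing step sizes, exponent in (1/2, 1]
    t1, t2 = block_suff_stats(y_stream[start:start + block_size], mu)
    s1 += gamma * (t1 - s1)     # stochastic-approximation E-step
    s2 += gamma * (t2 - s2)
    mu = s2 / s1                # closed-form M-step for the means
    start += block_size

print(mu)  # approaches (-1, 1.5)
```

Each block contributes normalized expected sufficient statistics computed by forward-backward smoothing; the stochastic approximation step then averages them across blocks before the closed-form M-step updates the parameter.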

Notations and Model assumptions
Application to inverse problems in Hidden Markov Models
Linear Gaussian Model
Finite state-space HMM
Assumptions
The limiting EM algorithm
Rate of convergence of the Block Online EM algorithms
Proofs
Appendix A: Technical results
