Abstract

In this paper, we propose an algorithm for estimating the parameters of a time-homogeneous hidden Markov model (HMM) from aggregate observations. This problem arises when only population-level counts of the number of individuals at each time step are available, and one seeks to learn the individual HMM from these observations. Our algorithm is built upon the classical expectation–maximization algorithm and the recently proposed aggregate inference algorithm (Sinkhorn belief propagation). We present the parameter learning algorithm for two different settings of HMMs, one with discrete observations and one with continuous observations, and the algorithm comes with convergence guarantees in both cases. Moreover, our learning framework naturally reduces to the standard Baum–Welch learning algorithm for HMMs when the population size is 1. The efficacy of our algorithm is demonstrated through several numerical experiments.
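As a point of reference for the special case mentioned above, the following is a minimal sketch (not the paper's aggregate algorithm) of the classical Baum–Welch procedure for a discrete-observation HMM, which the proposed framework reduces to when the population size is 1. The function and parameter names (`baum_welch`, `n_states`, `n_obs`, `n_iters`) are illustrative assumptions, not notation from the paper.

```python
# Minimal Baum-Welch sketch for a discrete-observation HMM (population size 1).
# This is NOT the paper's aggregate EM / Sinkhorn belief propagation algorithm,
# only the classical special case the abstract says the framework reduces to.
import numpy as np

def baum_welch(obs, n_states, n_obs, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.random((n_states, n_states)); A /= A.sum(axis=1, keepdims=True)  # transition matrix
    B = rng.random((n_states, n_obs));    B /= B.sum(axis=1, keepdims=True)  # emission matrix
    pi = np.full(n_states, 1.0 / n_states)                                   # initial distribution
    T = len(obs)
    for _ in range(n_iters):
        # E-step: scaled forward-backward recursions.
        alpha = np.zeros((T, n_states)); beta = np.zeros((T, n_states)); c = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)        # state posteriors
        xi = np.zeros((T - 1, n_states, n_states))       # pairwise posteriors
        for t in range(T - 1):
            xi[t] = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
            xi[t] /= xi[t].sum()
        # M-step: re-estimate parameters from expected counts.
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        B = np.zeros_like(B)
        for k in range(n_obs):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return pi, A, B

# Toy usage on a short synthetic observation sequence.
obs = np.array([0, 1, 0, 2, 1, 0, 0, 2, 1, 1])
pi, A, B = baum_welch(obs, n_states=2, n_obs=3)
```

In the aggregate setting described in the abstract, the per-individual forward-backward E-step above is replaced by aggregate inference over population counts (Sinkhorn belief propagation), while the overall EM structure is retained.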
