Abstract

A hidden Markov model (HMM) encompasses a large class of stochastic process models and has been successfully applied to a number of scientific and engineering problems, including speech and other pattern recognition problems, and biological sequence analysis. A major restriction of the conventional HMM, however, is that it is ill-suited to capturing the interactions among different models. A variety of coupled hidden Markov models (CHMMs) have recently been proposed as extensions of the HMM to better characterize multiple interdependent sequences. The resulting models have multiple state variables that are temporally coupled via matrices of conditional probabilities. This paper focuses on the coupled discrete HMM with two state variables in the network. By generalizing the forward-backward, Viterbi, and Baum-Welch algorithms commonly used in the conventional HMM to accommodate two state variables, several new formulae for the 2-chain coupled discrete HMM probability evaluation, decoding, and training problems are theoretically derived.
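To make the coupled structure concrete, the following is a minimal sketch of a forward pass for a 2-chain coupled discrete HMM, assuming the commonly used factorized coupling in which each chain's next state is conditioned on both chains' current states and each chain emits its own discrete observation. The array names (pi1, A1, B1, etc.) and this particular factorization are illustrative assumptions; the paper's exact generalized formulae are derived in the full text.

```python
import numpy as np

def coupled_forward(pi1, pi2, A1, A2, B1, B2, obs1, obs2):
    """Forward pass for a 2-chain coupled discrete HMM (illustrative sketch).

    pi1, pi2   : initial state distributions, shapes (N1,), (N2,)
    A1         : P(s_{t+1}=i | s_t=k, r_t=l), shape (N1, N2, N1)
    A2         : P(r_{t+1}=j | s_t=k, r_t=l), shape (N1, N2, N2)
    B1, B2     : emission matrices P(symbol | state), shapes (N1, M1), (N2, M2)
    obs1, obs2 : integer observation sequences of equal length T

    Returns the joint likelihood P(obs1, obs2) and the forward lattice
    alpha[t, i, j] = P(observations up to t, s_t = i, r_t = j).
    """
    T = len(obs1)
    N1, N2 = len(pi1), len(pi2)
    alpha = np.zeros((T, N1, N2))

    # Initialization: product of the two chains' priors and emissions.
    alpha[0] = np.outer(pi1 * B1[:, obs1[0]], pi2 * B2[:, obs2[0]])

    # Induction: sum over the joint previous state (k, l); each chain's
    # next state depends on both chains' previous states (the coupling).
    for t in range(1, T):
        trans = np.einsum('kl,kli,klj->ij', alpha[t - 1], A1, A2)
        alpha[t] = trans * np.outer(B1[:, obs1[t]], B2[:, obs2[t]])

    return alpha[-1].sum(), alpha
```

Because the recursion runs over the joint state space of the two chains, its cost per time step is O(N1^2 N2^2), which is why the paper's generalization of the standard HMM recursions to two state variables is nontrivial.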
