
1.1 Preliminaries

In this chapter we focus on what Rabiner, in his popular tutorial (Rabiner, 1989), calls "uncovering the hidden part of the model", or "Problem 2", that is, hidden path inference. We consider a hidden Markov model $(X,Y) = \{(X_t, Y_t)\}_{t \in \mathbb{Z}}$, where $Y = \{Y_t\}_{t \in \mathbb{Z}}$ is an unobservable, or hidden, homogeneous Markov chain with finite state space $S = \{1, \ldots, K\}$, transition matrix $P = (p_{i,j})_{i,j \in S}$ and, whenever relevant, initial probabilities $\pi_s = P(Y_1 = s)$, $s \in S$. A reader interested in extensions to the continuous case is referred to (Cappé et al., 2005; Chigansky & Ritov, 2010). The Markov chain will further be assumed to be of first order, bearing in mind that a higher-order chain can always be converted to a first-order one by expanding the state space. To simplify the mathematics, we assume that the Markov chain $Y$ is stationary and ergodic. This assumption is needed for the asymptotic results in Section 3, but not for the finite time-horizon results in Section 2; in fact, Section 2 does not even require homogeneity. The second component $X = \{X_t\}_{t \in \mathbb{Z}}$ is an observable process, with $X_t$ taking values in $\mathcal{X}$, typically a subset of Euclidean space, i.e. $\mathcal{X} \subset \mathbb{R}^d$. The process $X$ can be thought of as a noisy version of $Y$. In order for $(X,Y)$ to be a hidden Markov model, the following properties need to be satisfied:

(i) given $\{Y_t\}$, the observations $\{X_t\}$ are conditionally independent;
(ii) the conditional distribution of $X_t$ given $Y$ depends on $Y_t$ only.
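To make the object of "Problem 2" concrete, the following is a minimal sketch (not taken from the chapter) of Viterbi decoding of the hidden path for an HMM with a finite emission alphabet. The transition matrix `P` and initial probabilities `pi` correspond to the quantities defined above; the emission matrix `B`, the function name `viterbi` and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def viterbi(x, P, pi, B):
    """Most likely hidden path for an observation sequence x (log-space recursion).

    P  : (K, K) transition matrix, P[i, j] = p_{i,j}
    pi : (K,)   initial probabilities, pi[s] = P(Y_1 = s+1)
    B  : (K, M) emission matrix, B[s, m] = P(X_t = m | Y_t = s+1)
    """
    K, T = P.shape[0], len(x)
    logP, logpi, logB = np.log(P), np.log(pi), np.log(B)
    delta = np.zeros((T, K))           # best log-probability of a path ending in each state
    psi = np.zeros((T, K), dtype=int)  # back-pointers
    delta[0] = logpi + logB[:, x[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logP   # scores[i, j]: come from state i, move to state j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, x[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):     # backtrack along the stored pointers
        path[t] = psi[t + 1, path[t + 1]]
    return path                        # states indexed 0..K-1, i.e. S shifted by one

# Toy usage with K = 2 hidden states and M = 2 emission symbols (illustrative numbers).
P = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
B = np.array([[0.8, 0.2], [0.3, 0.7]])
print(viterbi([0, 1, 1, 0], P, pi, B))
```

The recursion is carried out in log-space purely for numerical stability; it does not change the maximizing path.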
