Abstract

The optimal state sequence of a generalized high-order Hidden Markov Model (HHMM) is tracked from a given observational sequence using the classical Viterbi algorithm, which is based on the maximum-likelihood criterion. We introduce an entropy-based Viterbi algorithm for tracking the optimal state sequence of an HHMM. The entropy of a state sequence is a useful quantity, providing a measure of the uncertainty of an HHMM; there is no uncertainty when only one optimal state sequence is possible. This entropy-based decoding algorithm can be formulated with either an extended or a reduction approach. We extend the entropy-based algorithm for computing the optimal state sequence, originally developed for first-order models, to a generalized HHMM with a single observational sequence. The cost of this extended algorithm grows exponentially with the order of the HMM, owing to the growth in the number of model parameters. We therefore introduce an efficient entropy-based decoding algorithm that uses the reduction approach, namely the entropy-based order-transformation forward algorithm (EOTFA), to compute the optimal state sequence of any generalized HHMM. EOTFA transforms a generalized high-order HMM into an equivalent first-order HMM, and an entropy-based decoding algorithm is then developed on the equivalent first-order model. The computation is performed along the observational sequence and requires O(TÑ²) calculations, where Ñ is the number of states in the equivalent first-order model and T is the length of the observational sequence.

Highlights

  • The state sequence of a Hidden Markov Model (HMM) is hidden, but the most likely state sequence can be tracked from the model parameters and a given observational sequence

  • The state entropy of an HHMM is computed recursively, reducing the computational complexity from O(N^(T+k−1)) under the direct evaluation method to O(TN^(k+1)) for a High-Order Hidden Markov Model (HHMM), where N is the number of states, T is the length of the observational sequence, and k is the order of the Hidden Markov Model

  • Our algorithm transforms a generalized high-order HMM into an equivalent first-order HMM and computes the state entropy on the equivalent first-order model; it is the most efficient of the three, requiring O(TÑ²) calculations, compared with O(N^(T+k−1)) for the direct evaluation method and O(TN^(k+1)) for the extended algorithm, where N is the number of states in the model, Ñ is the number of states in the equivalent first-order model, T is the length of the observational sequence, and k is the order of the HMM
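As an illustrative sketch of the order-reduction idea (not the paper's exact EOTFA construction), a second-order HMM can be collapsed into an equivalent first-order HMM whose composite states are ordered pairs of the original states; the names `to_first_order` and `A2` are ours:

```python
from itertools import product

import numpy as np

def to_first_order(A2):
    """Collapse a second-order HMM transition tensor into an equivalent
    first-order transition matrix over composite pair-states.

    A2[i, j, k] = P(s_t = k | s_{t-2} = i, s_{t-1} = j).
    The equivalent model has N~ = N*N composite states (i, j), and a
    transition (i, j) -> (j', k) is allowed only when j' == j, so that
    consecutive pairs chain consistently.
    """
    N = A2.shape[0]
    pairs = list(product(range(N), repeat=2))   # composite states (i, j)
    A1 = np.zeros((N * N, N * N))
    for p, (i, j) in enumerate(pairs):
        for q, (j2, k) in enumerate(pairs):
            if j2 == j:                          # pairs must overlap: (i,j) -> (j,k)
                A1[p, q] = A2[i, j, k]
    return pairs, A1
```

Each row of the resulting matrix sums to one, so the composite-state chain is a proper first-order Markov chain on which first-order forward recursions can run unchanged.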



Introduction

The state sequence of a Hidden Markov Model (HMM) is hidden, but the most likely state sequence can be tracked from the model parameters and a given observational sequence. The classical Viterbi algorithm is the most common technique for tracking the state sequence from a given observational sequence [2], but it does not measure the uncertainty present in the solution. Hernando et al. [4] proposed using entropy to measure the uncertainty of the state sequence of a first-order HMM tracked from a single observational sequence of length T.
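The first-order recursion of Hernando et al. can be sketched as follows: alongside the forward probabilities, each state carries the entropy of the partial state sequences ending in it, which costs O(TN²) rather than enumerating all N^T sequences. The function name `sequence_entropy` and the array layout are our own assumptions, and strictly positive model parameters are assumed so that all logarithms are well defined:

```python
import numpy as np

def sequence_entropy(pi, A, B, obs):
    """Entropy (in nats) of the hidden-state posterior of a first-order HMM,
    computed in O(T N^2) with a forward-style entropy recursion.

    pi[i]   : initial state probabilities
    A[i, j] : P(s_t = j | s_{t-1} = i)
    B[i, o] : P(observation o | state i)
    obs     : observation index sequence of length T
    """
    alpha = pi * B[:, obs[0]]                  # forward probabilities
    H = np.zeros(len(pi))                      # H[j]: entropy of paths ending in j
    for o in obs[1:]:
        w = alpha[:, None] * A                 # w[i, j] = alpha_{t-1}(i) * a_{ij}
        p = w / w.sum(axis=0)                  # P(s_{t-1}=i | s_t=j, past obs)
        H = (p * (H[:, None] - np.log(p))).sum(axis=0)
        alpha = w.sum(axis=0) * B[:, o]
        alpha /= alpha.sum()                   # rescale to avoid underflow
    post = alpha / alpha.sum()                 # P(s_T = j | full observation)
    return float((post * (H - np.log(post))).sum())
```

A quick sanity check: with uniform initial, transition, and emission probabilities the posterior is uniform over all N^T state sequences, so the entropy is T·log N.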

Entropy-Based Decoding Algorithm with an Extended Approach
Entropy-Based Decoding Algorithm with a Reduction Approach
Conclusion and Future Work
