Abstract

We present the Evidential Hidden Markov Model (EvHMM), an extension of the probabilistic HMM for time-series modeling in which conditional belief functions replace probabilities to manage uncertainty on discrete latent variables. Inference and learning mechanisms are described that solve the three problems originally defined for HMMs: the classification problem (finding the most plausible model), the decoding problem (finding the best sequence of hidden states), and the learning problem from incomplete and uncertain data (estimating the parameters). Exact inference mechanisms based on the Generalized Bayesian Theorem (GBT) are proposed, which recover the standard HMM when probabilities are used. An EM-like procedure is developed for parameter learning, relying on approximations that keep the solution tractable. Relationships with both the learning criterion conjectured by Vannoorenberghe and Smets and the formulation of evidential Markov chains (EMCs) by Pieczynski et al. are discussed. A comparison with standard HMMs on simulated data confirms the benefit of using random disjunctive sets to represent data incompleteness in evidential temporal graphical models.
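For readers unfamiliar with the three classical HMM problems mentioned above, the following minimal sketch illustrates the first two (classification via the forward likelihood, decoding via Viterbi) for a *standard probabilistic* HMM, i.e. the special case that EvHMM recovers when all belief functions reduce to probabilities. It is not the paper's evidential algorithm; the parameters `pi`, `A`, `B`, and `obs` are hypothetical toy values.

```python
# Sketch of the classification and decoding problems for a standard HMM
# (the probabilistic special case of EvHMM). Toy parameters, not from the paper.
import numpy as np

pi = np.array([0.6, 0.4])              # initial state distribution
A = np.array([[0.7, 0.3],              # transition matrix P(state_t | state_{t-1})
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],         # emission matrix P(obs | state)
              [0.1, 0.3, 0.6]])
obs = [0, 1, 2, 1]                     # toy observation sequence

def forward_likelihood(pi, A, B, obs):
    """Classification problem: P(observations | model) via the forward pass."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def viterbi(pi, A, B, obs):
    """Decoding problem: most probable hidden-state sequence (log domain)."""
    delta = np.log(pi) + np.log(B[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + np.log(A)   # best score entering each state
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + np.log(B[:, o])
    path = [int(delta.argmax())]
    for bp in reversed(back):                 # backtrack through the pointers
        path.append(int(bp[path[-1]]))
    return path[::-1]

print(forward_likelihood(pi, A, B, obs))  # likelihood used to pick the most plausible model
print(viterbi(pi, A, B, obs))             # best hidden-state sequence
```

The third problem (parameter learning) is classically handled by Baum-Welch; the paper's contribution is an EM-like counterpart operating on belief functions rather than probabilities.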
