Abstract

Recent developments in the theory of Markov models and their associated computational techniques have opened new possibilities for image and signal modeling. In particular, embedding the Dempster–Shafer theory of evidence within Markov models has provided answers to several challenging problems that conventional hidden Markov models cannot handle. These problems arise mainly in two situations: multisensor data, where Dempster–Shafer fusion is unworkable in the conventional framework; and nonstationary data, where the estimated stationary model mismatches the actual data. In each of the two situations, the Dempster–Shafer combination rule has been applied, thanks to the triplet Markov model formalism, to overcome the drawbacks of the standard Bayesian models. So far, however, the two situations have not been addressed at the same time. In this article, we propose an evidential Markov chain that uses the Dempster–Shafer combination rule to bring contextual information into the segmentation of multisensor nonstationary data. We also provide the Expectation–Maximization (EM) parameter estimation and maximum posterior marginal (MPM) restoration procedures. To validate the proposed model, experiments are conducted on synthetic multisensor data and noised images. The segmentation results are then compared with those obtained with conventional approaches to bring out the efficiency of the proposed model.
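As an illustrative sketch of the fusion mechanism the abstract refers to (not code from the paper), the snippet below implements Dempster's rule of combination for two mass functions defined over subsets of a frame of discernment. The representation of mass functions as dictionaries keyed by frozensets, and the sensor masses used in the example, are our own assumptions for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozensets of
    hypotheses to masses) with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalize by 1 - K, where K is the total conflict mass
    return {a: w / (1.0 - conflict) for a, w in combined.items()}

# Two sensors reporting beliefs over the classes {w1, w2};
# mass on {w1, w2} encodes ignorance about the class
m_sensor1 = {frozenset({"w1"}): 0.6, frozenset({"w1", "w2"}): 0.4}
m_sensor2 = {frozenset({"w1"}): 0.5, frozenset({"w2"}): 0.3,
             frozenset({"w1", "w2"}): 0.2}
fused = dempster_combine(m_sensor1, m_sensor2)
```

Note how the mass placed on the whole frame {w1, w2} lets a sensor express partial ignorance, which is precisely what a simple Bayesian prior cannot do.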

Highlights

  • Hidden Markov chains (HMCs) have been used to solve a wide range of inverse problems occurring in many application fields

  • Pairwise Markov chains (PMCs) and triplet Markov chains (TMCs), which are more general than the conventional HMCs, are briefly described

  • A new model, the multisensor nonstationary hidden Markov chain (MN-HMC), is described together with its maximum posterior marginal (MPM) restoration and EM parameter estimation procedures



Introduction

Hidden Markov chains (HMCs) have been used to solve a wide range of inverse problems occurring in many application fields. They allow one to take the contextual information within data into account. Let X = X1..N be a hidden process taking its values in a finite set of classes Ω = {ω1, …, ωK}, and let Y = Y1..N be an observable process that takes its values in R and that can be seen as a noisy version of X. According to the HMC formalism, the hidden process X has a Markov distribution, which is why the model is qualified as “hidden Markov”. The observations Yn are assumed to be independent conditionally on X, and the contextual information is considered only through X. This provides a well-designed formalism that takes the contextual information of the data into account while keeping the model simple and the necessary estimation procedures workable. According to HMCs, the joint distribution of (X, Y) is given by

p(x, y) = p(x1) p(y1 | x1) ∏_{n=1}^{N−1} p(x_{n+1} | x_n) p(y_{n+1} | x_{n+1})
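For concreteness, the posterior marginals p(x_n | y_1..N) that underlie MPM restoration in an HMC can be computed with the classical forward-backward recursions. The following sketch (our own minimal implementation, not the paper's code; the discrete emission matrix and the toy parameters are assumptions for illustration) returns the marginals and the MPM segmentation as their argmax.

```python
import numpy as np

def hmc_posterior_marginals(pi, A, B, obs):
    """Forward-backward on a hidden Markov chain.
    pi: (K,) initial distribution; A: (K, K) transition matrix;
    B: (K, M) emission probabilities over M discrete symbols;
    obs: length-N sequence of symbol indices.
    Returns the (N, K) posterior marginals p(x_n = k | y_1..N);
    the MPM restoration is the argmax over k at each n."""
    N, K = len(obs), len(pi)
    alpha = np.zeros((N, K))
    beta = np.ones((N, K))
    # Forward pass, normalized at each step for numerical stability
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for n in range(1, N):
        alpha[n] = (alpha[n - 1] @ A) * B[:, obs[n]]
        alpha[n] /= alpha[n].sum()
    # Backward pass
    for n in range(N - 2, -1, -1):
        beta[n] = A @ (B[:, obs[n + 1]] * beta[n + 1])
        beta[n] /= beta[n].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Toy two-class chain with binary observations
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])       # sticky transitions
B = np.array([[0.8, 0.2], [0.3, 0.7]])       # class-conditional emissions
post = hmc_posterior_marginals(pi, A, B, [0, 0, 1, 1, 1])
mpm = post.argmax(axis=1)                     # MPM segmentation
```

The per-step normalization changes alpha and beta only by constants that cancel in the final ratio, so the returned marginals are exact while avoiding underflow on long chains.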

