Abstract
One-dimensional hidden Markov models (1-D HMMs) are statistical models of sequential data that have been used successfully in many machine learning applications, especially speech recognition. The two-dimensional hidden Markov model (2-D HMM) extends the 1-D HMM to two dimensions and provides a principled statistical method for modeling matrix data. This paper describes the structure of the third-order 2-D HMM, in which the probability of a transition depends not only on the immediately preceding vertical and horizontal states but also on the immediately preceding diagonal state, and presents a study of learning algorithms for the third-order 2-D HMM. Using the idea that the sequences of states along the columns or rows of a third-order 2-D HMM can be viewed as the states of a 1-D HMM, and constructing an associated objective function via the Lagrange multiplier method, several new re-estimation formulae for the model learning algorithms are derived theoretically.
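The dependence structure described above can be illustrated with a minimal sketch. This is not the paper's notation or algorithm; the state count, tensor layout, and boundary handling below are all assumptions chosen for illustration. The key point it shows is that each hidden state on the grid is conditioned on three neighbors: vertical, horizontal, and diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3  # number of hidden states (assumed for illustration)

# Transition tensor (assumed layout):
# A[v, h, d, k] = P(s[i][j] = k | vertical neighbor = v,
#                   horizontal neighbor = h, diagonal neighbor = d)
A = rng.random((N, N, N, N))
A /= A.sum(axis=-1, keepdims=True)  # each conditional distribution sums to 1

def grid_log_prob(states, A):
    """Log-probability of the interior of a state grid under the
    third-order transition structure, conditioning on the first row
    and first column (boundary handling is a modeling choice)."""
    rows, cols = states.shape
    logp = 0.0
    for i in range(1, rows):
        for j in range(1, cols):
            v = states[i - 1, j]      # vertical neighbor
            h = states[i, j - 1]      # horizontal neighbor
            d = states[i - 1, j - 1]  # diagonal neighbor
            logp += np.log(A[v, h, d, states[i, j]])
    return logp

states = rng.integers(0, N, size=(4, 4))  # a random hidden-state grid
print(grid_log_prob(states, A))
```

In a first-order 2-D HMM the diagonal index `d` would be absent; the third-order model enlarges the transition tensor so that re-estimation must account for this extra conditioning variable, which is what the Lagrange-multiplier derivation in the paper addresses.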
International Journal of Advancements in Computing Technology