Abstract

The self-organizing hidden Markov model map (SOHMMM) is a hybrid integration of the self-organizing map (SOM) and the hidden Markov model (HMM). Its scaled, online, gradient-descent unsupervised learning algorithm is an amalgam of SOM unsupervised training and the HMM reparameterized forward-backward technique. In essence, an HMM is associated with each neuron of the SOHMMM lattice, and the image of an input sequence on the SOHMMM mesh is defined as the location of the best-matching reference HMM. Model tuning and adaptation can take place directly from raw data, within an automated context. The SOHMMM can accommodate and analyze deoxyribonucleic acid, ribonucleic acid, and protein chain molecules, as well as generic sequences of high dimensionality and variable length encoded directly in nonnumerical/symbolic alphabets. Furthermore, the SOHMMM is capable of integrating and exploiting latent information hidden in the spatiotemporal dependencies/correlations among sequence elements.
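
To make the core mechanism concrete, the following is a minimal sketch (not the authors' implementation or learning algorithm) of the mapping step described above: each neuron of a SOM-style lattice holds its own HMM, and the image of a symbolic input sequence is the lattice position whose HMM scores the sequence best, here measured by scaled forward-algorithm log-likelihood. The alphabet, lattice size, and number of hidden states are illustrative assumptions; the paper's SOM-neighborhood gradient updates are omitted.

```python
# Minimal sketch of the SOHMMM "image of an input sequence" step: one HMM per
# lattice neuron, best-matching unit chosen by forward log-likelihood.
# Alphabet, lattice size, and HMM dimensions below are illustrative assumptions.
import numpy as np

ALPHABET = "ACGT"            # assumed symbolic alphabet (e.g., DNA)
N_STATES = 3                 # assumed number of hidden states per neuron's HMM
GRID_ROWS, GRID_COLS = 4, 4  # assumed lattice dimensions

rng = np.random.default_rng(0)

def random_hmm(n_states, n_symbols):
    """Randomly initialized HMM: initial, transition, and emission probabilities."""
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # rows sum to 1
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)  # rows sum to 1
    return pi, A, B

def forward_loglik(seq, pi, A, B):
    """Scaled forward algorithm: log P(sequence | HMM), stable for long sequences."""
    obs = [ALPHABET.index(c) for c in seq]
    alpha = pi * B[:, obs[0]]
    log_lik = 0.0
    for t, o in enumerate(obs):
        if t > 0:
            alpha = (alpha @ A) * B[:, o]  # forward recursion
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha /= scale                     # rescale to avoid numerical underflow
    return log_lik

# One reference HMM associated with every neuron of the lattice.
lattice = {(r, c): random_hmm(N_STATES, len(ALPHABET))
           for r in range(GRID_ROWS) for c in range(GRID_COLS)}

def best_matching_unit(seq):
    """Image of the input sequence: position of the best-matching reference HMM."""
    return max(lattice, key=lambda pos: forward_loglik(seq, *lattice[pos]))

print(best_matching_unit("ACGTGCAATTCG"))  # prints the winning lattice coordinate
```

In the full SOHMMM, the HMM parameters of the winning neuron and its lattice neighbors would additionally be adapted online via the reparameterized forward-backward gradients; the sketch covers only the best-matching-unit lookup.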
