We present hidden abstract stack Markov models (HASMMs) together with their learning process. HASMMs combine the greater expressiveness of probabilistic context-free grammars (PCFGs) with the faster parameter fitting of hidden Markov models (HMMs). Both HMMs and PCFGs are widely used structured models, offering an effective formalism for describing diverse phenomena. PCFGs are better suited than HMMs to applications such as natural language processing; however, HMMs outperform PCFGs in parameter fitting. We extend HMMs toward PCFGs for such applications by associating each state of an HMM with an abstract stack, which can be thought of as a pushdown automaton (PDA) stack over a single-symbol stack alphabet. As a result, we leverage the expressive capabilities of PCFGs for such applications while mitigating the cubic complexity (in the observation sequence length) of PCFG parameter learning by adopting the bilinear complexity of HMMs.
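To illustrate the idea of pairing each HMM state with an abstract stack, here is a minimal, hypothetical sketch (the function `forward`, the depth bound `max_depth`, the counter-style stack encoding, and the toy parameters are our assumptions, not the paper's actual formalism): a forward pass over (state, stack depth) pairs whose cost stays linear in the observation sequence length, in contrast to the cubic inside-outside computation used for PCFGs.

```python
# Hypothetical sketch (not the paper's formalism): an HMM whose hidden state is
# paired with the depth of an "abstract stack" over a single symbol, so each
# transition may also push, pop, or leave the stack unchanged. Inference remains
# an HMM-style forward pass over the observation sequence.
import numpy as np

def forward(obs, init, trans, emit, stack_op, max_depth):
    """Forward probabilities over (state, stack depth) pairs.

    obs       : list of observation indices
    init      : (S,) initial state distribution, stack assumed empty
    trans     : (S, S) state transition probabilities
    emit      : (S, O) emission probabilities
    stack_op  : (S, S) array of -1 (pop), 0 (no-op), or +1 (push) per transition
    max_depth : bound on the abstract stack depth tracked in this sketch
    """
    S = len(init)
    alpha = np.zeros((S, max_depth + 1))
    alpha[:, 0] = init * emit[:, obs[0]]          # start with an empty stack
    for o in obs[1:]:
        nxt = np.zeros_like(alpha)
        for i in range(S):
            for j in range(S):
                shift = stack_op[i, j]
                for d in range(max_depth + 1):
                    d2 = d + shift
                    if 0 <= d2 <= max_depth:      # a pop needs a non-empty stack
                        nxt[j, d2] += alpha[i, d] * trans[i, j] * emit[j, o]
        alpha = nxt
    return alpha

# Toy usage: 2 states, 2 observation symbols; push on 0->1, pop on 1->0.
init = np.array([1.0, 0.0])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.9, 0.1], [0.2, 0.8]])
stack_op = np.array([[0, 1], [-1, 0]])
print(forward([0, 1, 1, 0], init, trans, emit, stack_op, max_depth=3))
```

Each step of this sketch costs O(S^2 · D) for D tracked stack depths, so the total work grows linearly with the observation sequence length, which is the complexity advantage over PCFG learning that the abstract refers to.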