Abstract

There has been interest in classifiers based on products of experts (PoE), which offer an alternative to the standard mixture of experts (MoE) framework. This paper presents a particular form of PoE, the product of Gaussians (PoG), within a hidden Markov model framework. Training and initialisation procedures are described for this PoG system. In addition, the relationship of PoG to standard multiple-stream systems is explored. The performance of the PoG system is examined on the SwitchBoard task and compared to standard Gaussian mixture systems and multiple-stream systems.
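As a minimal illustration of the product-of-Gaussians idea (a standard Gaussian identity, not the paper's specific PoG-HMM formulation): the product of two univariate Gaussian experts is itself proportional to a Gaussian whose precision is the sum of the experts' precisions and whose mean is the precision-weighted average of their means,

\mathcal{N}(x;\mu_1,\sigma_1^2)\,\mathcal{N}(x;\mu_2,\sigma_2^2) \propto \mathcal{N}(x;\mu,\sigma^2), \qquad \frac{1}{\sigma^2}=\frac{1}{\sigma_1^2}+\frac{1}{\sigma_2^2}, \qquad \mu=\sigma^2\!\left(\frac{\mu_1}{\sigma_1^2}+\frac{\mu_2}{\sigma_2^2}\right).

This sharpening behaviour, in which each expert constrains the combined distribution, contrasts with a mixture of experts, where components are added rather than multiplied.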
