Abstract

The learning problem of continuous hidden Markov models (CHMMs) is the most critical and challenging obstacle to their application. This paper attacks the learning problem of CHMMs with a diversified gradient descent (DGD) algorithm. A novel learning formula for the CHMM parameters, which requires no special form of the objective function and yields parameter estimates with different degrees of diversity, is derived by dynamically adjusting the iterative procedure according to the gradient change of each parameter. This is the first work on the standard CHMM that attempts to obtain more local maxima, so that the global maximum of the CHMM likelihood function can be better approximated or even discovered; the paper therefore takes an important step toward solving the learning problem of CHMMs. Furthermore, a likelihood-based model averaging (LBMA) estimator is developed to achieve robust parameter estimation of the CHMM based on the diverse models obtained by the DGD algorithm. The proposed methods are tested on a simulation study and a real-life bearing fault diagnosis problem. The results show that they outperform conventional methods in both parameter estimation and bearing fault diagnosis.
