Abstract

The learning problem of continuous hidden Markov models (CHMMs) is the most critical and challenging obstacle to their application. This paper attacks the learning problem of CHMMs with the diversified gradient descent (DGD) algorithm. A novel learning formula for the CHMM parameters, which requires no special form of the objective function and yields parameter estimates with varying degrees of diversity, is derived by dynamically adjusting the iterative procedure according to the gradient change of each parameter. To the best of our knowledge, this is the first work on the standard CHMM that attempts to obtain multiple local maxima so that the global maximum of the CHMM likelihood function can be better approximated or even discovered. This paper therefore takes an important step toward solving the learning problem of CHMMs. Furthermore, a likelihood-based model averaging (LBMA) estimator is developed to achieve robust parameter estimation of the CHMM based on the diverse models obtained by the DGD algorithm. The proposed methods are tested on simulated data and a real-life bearing fault diagnosis problem. The results show that they outperform conventional methods in both parameter estimation and bearing fault diagnosis.
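
The abstract does not spell out the DGD update rule or the LBMA weighting scheme, so the following is only a minimal, self-contained sketch of the two ideas it names, under explicit assumptions: diversity is induced here by running gradient ascent on the log-likelihood from several perturbed starting points with different step sizes, and the LBMA estimate weights each local solution by its relative likelihood (a softmax over log-likelihoods). The model, the functions hmm_loglik and gradient_ascent, the fixed two-state Gaussian HMM with known transition matrix, the synthetic data, and all numeric choices are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

TRANS = np.array([[0.9, 0.1], [0.1, 0.9]])   # assumed known transition matrix

def hmm_loglik(obs, means, sigma=1.0):
    """Log-likelihood of a 2-state Gaussian HMM (forward algorithm, log domain)."""
    log_b = (-0.5 * ((obs[:, None] - means[None, :]) / sigma) ** 2
             - 0.5 * np.log(2.0 * np.pi * sigma ** 2))
    log_alpha = np.log(np.full(2, 0.5)) + log_b[0]
    for t in range(1, len(obs)):
        m = log_alpha.max()
        log_alpha = np.log(np.exp(log_alpha - m) @ TRANS) + m + log_b[t]
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

def gradient_ascent(obs, means0, lr, n_iter=300, eps=1e-4):
    """Numerical-gradient ascent on the log-likelihood of the state means."""
    means = means0.astype(float).copy()
    for _ in range(n_iter):
        grad = np.zeros_like(means)
        for k in range(len(means)):
            d = np.zeros_like(means)
            d[k] = eps
            grad[k] = (hmm_loglik(obs, means + d)
                       - hmm_loglik(obs, means - d)) / (2.0 * eps)
        means += lr * grad / len(obs)   # per-run step size -> different local maxima
    return means

rng = np.random.default_rng(0)
# Synthetic observations from states centred at -2 and 3 (illustrative only).
obs = np.concatenate([rng.normal(-2.0, 1.0, 100), rng.normal(3.0, 1.0, 100)])

# "Diversified" runs: different random starting points and step sizes.
runs = [gradient_ascent(obs, rng.normal(0.0, 3.0, 2), lr) for lr in (0.05, 0.1, 0.2)]
logliks = np.array([hmm_loglik(obs, m) for m in runs])

# Likelihood-based model averaging: weight each local solution by its relative likelihood.
weights = np.exp(logliks - logliks.max())
weights /= weights.sum()
lbma_means = sum(w * np.sort(m) for w, m in zip(weights, runs))  # sort to align state labels
print("per-run estimates:", [np.round(np.sort(m), 2) for m in runs])
print("LBMA estimate    :", np.round(lbma_means, 2))
```

In this toy setting, runs that converge to poor local maxima receive near-zero weight, so the averaged estimate is dominated by the best-scoring solutions; the sort before averaging is a simplistic way to align state labels across runs and would need a proper matching step in a full CHMM.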
