Abstract
Hidden Markov Models (HMMs) are used in a wide range of artificial intelligence applications, including speech recognition, computer vision, computational biology and finance. Estimating an HMM's parameters is often addressed via the Baum-Welch algorithm (BWA), but this algorithm tends to converge to a local optimum of the model parameters. Optimizing HMM parameters therefore remains a crucial and challenging task. In this paper, a Variable Neighborhood Search (VNS) combined with the Baum-Welch algorithm (VNS-BWA) is proposed. The idea is to use VNS to escape from local minima, enable greater exploration of the search space, and enhance the learning capability of HMMs. The proposed algorithm combines the gradient-free search mechanism of the VNS algorithm with the BWA, which does exploit that kind of information. The performance of the proposed method is validated on a real dataset. The results show that VNS-BWA performs better at finding the optimal parameters of HMM models, enhancing their learning capability and classification performance.
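To make the Baum-Welch step concrete: each EM pass re-estimates the transition matrix, emission matrix, and initial distribution from the forward-backward posteriors, and each pass is guaranteed not to decrease the likelihood, although it may stall at a local optimum. The sketch below (not the paper's implementation; all names and the scaled-recursion details are illustrative) shows one such pass for a discrete-observation HMM:

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """Scaled forward-backward pass; returns log-likelihood and posteriors."""
    T, N = len(obs), len(pi)
    alpha, beta, c = np.zeros((T, N)), np.zeros((T, N)), np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta                      # state posteriors, up to scale
    gamma /= gamma.sum(axis=1, keepdims=True)
    # xi[t, i, j]: posterior of taking transition i -> j at step t
    xi = alpha[:-1, :, None] * A[None] * (B[:, obs[1:]].T * beta[1:])[:, None, :]
    xi /= xi.sum(axis=(1, 2), keepdims=True)
    return np.log(c).sum(), gamma, xi

def baum_welch_step(A, B, pi, obs):
    """One EM re-estimation; returns updated (A, B, pi) and the current log-lik."""
    ll, gamma, xi = forward_backward(A, B, pi, obs)
    A_new = xi.sum(0) / gamma[:-1].sum(0)[:, None]   # expected transition counts
    B_new = np.zeros_like(B)
    for k in range(B.shape[1]):                       # expected emission counts
        B_new[:, k] = gamma[obs == k].sum(0)
    B_new /= gamma.sum(0)[:, None]
    return A_new, B_new, gamma[0], ll
```

Iterating `baum_welch_step` gives a monotonically non-decreasing log-likelihood, which is exactly the behavior that makes BWA fast locally but unable to leave the basin of attraction it starts in.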
Highlights
Hidden Markov models are statistical models that have enjoyed great success and wide use in a vast range of application fields, such as computational biology, speech processing, pattern recognition, and finance [1, 2, 3, 4, 5, 6, 7], among other disciplines. Attempts to estimate Hidden Markov Model (HMM) parameters have been made by several authors [9, 10, 11, 12, 13], but the dominant approach to the HMM parameter estimation problem is the Baum-Welch algorithm [8]. Despite its widespread use in practice, the Baum-Welch algorithm can get trapped in local optima of the model parameters.
This paper proposes the Variable Neighborhood Search (VNS)-Baum-Welch algorithm (VNS-BWA) as a new approach to training hidden Markov models
VNS is used in combination with the Baum-Welch algorithm (BWA) to explore the search space for the optimal parameter structure of HMMs
Summary
Hidden Markov models are statistical models that have enjoyed great success and wide use in a vast range of application fields, such as computational biology, speech processing, pattern recognition, and finance [1, 2, 3, 4, 5, 6, 7], among other disciplines. Attempts to estimate HMM parameters have been made by several authors [9, 10, 11, 12, 13], but the dominant approach to the HMM parameter estimation problem is the Baum-Welch algorithm [8]. Despite its widespread use in practice, the Baum-Welch algorithm can get trapped in local optima of the model parameters. There is a need for an algorithm that can escape from local optima and probe the solution space to reach the global optimum of the model parameters. Variable neighborhood search [15] is among the class of metaheuristics that have provided optimal solutions in many different problem domains, by considering changes of neighborhood in both the descent phase (to find a local optimum) and the perturbation phase (to get out of the corresponding basin of attraction). VNS has been successfully applied to a wide variety of optimization problems, such as the clustered vehicle problem, the maximum min-sum dispersion problem, or the financial derivative problem [16, 17, 18, 19], to cite several recent works.
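The VNS mechanism described above (shake in a neighborhood of growing size, run a local descent, recenter on improvement or widen the neighborhood otherwise) can be sketched generically. The following is a minimal illustration on a toy continuous objective, not the paper's VNS-BWA: the `local_descent` stand-in plays the role that BWA plays in VNS-BWA, and all names, radii, and iteration counts are assumptions.

```python
import numpy as np

def local_descent(f, x, step=0.05, rounds=50):
    """Crude coordinate descent; a stand-in for the problem-specific local search."""
    x, fx = np.asarray(x, float).copy(), f(x)
    for _ in range(rounds):
        improved = False
        for i in range(x.size):
            for d in (step, -step):
                y = x.copy(); y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:          # local optimum at this step size
            break
    return x, fx

def vns_minimize(f, x0, radii=(0.1, 0.5, 1.0), iters=100, seed=0):
    """Basic VNS loop: shake in neighborhood k, descend, move-or-next-k."""
    rng = np.random.default_rng(seed)
    best = np.asarray(x0, float)
    fbest = f(best)
    for _ in range(iters):
        k = 0
        while k < len(radii):
            # Perturbation (shaking): random point in the k-th neighborhood
            x = best + rng.uniform(-radii[k], radii[k], size=best.shape)
            # Descent phase: local optimizer (BWA in the paper's VNS-BWA)
            x, fx = local_descent(f, x)
            if fx < fbest:        # improvement: recenter and restart at k = 0
                best, fbest = x, fx
                k = 0
            else:                 # no improvement: try a wider neighborhood
                k += 1
    return best, fbest
```

The systematic widening of the perturbation radius is what lets the search leave a basin of attraction that the local descent alone cannot escape.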
More From: Statistics, Optimization & Information Computing