Abstract

In this chapter, we consider the problem of Hidden Markov Model (HMM) training. First, HMMs are introduced, and then we focus on the HMM training problem itself. We emphasize the difficulty of this problem and present the various criteria that can be considered. Many different adaptations of metaheuristics have already been used, but, until now, few extensive comparisons have been performed for this problem. We propose to compare three population-based metaheuristics (genetic algorithm, ant algorithm and particle swarm optimization) with and without the help of a local optimizer. These algorithms explore solutions in three different kinds of search space (a constrained space, a discrete space and a vector space). We study these algorithms from both a theoretical and an experimental perspective: parameter settings are fully studied on a reduced set of data, and the performances of the algorithms are compared on different sets of real data.
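For context, the training criterion most commonly optimized for an HMM is the likelihood of the observation sequence given the model parameters, which can be evaluated with the forward algorithm; metaheuristics then search over the stochastic parameter matrices to maximize it. Below is a minimal Python sketch of this likelihood computation (the function name, variable names and the toy dimensions are illustrative, not taken from the chapter):

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood log P(obs | lambda) via the scaled forward algorithm.

    pi  : (N,)   initial state probabilities
    A   : (N, N) transition matrix, A[i, j] = P(state j | state i)
    B   : (N, M) emission matrix, B[i, k] = P(symbol k | state i)
    obs : (T,)   observation symbol indices in {0, ..., M-1}
    """
    alpha = pi * B[:, obs[0]]            # initial forward variables
    log_lik = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = (alpha @ A) * B[:, obs[t]]   # induction step
        scale = alpha.sum()
        log_lik += np.log(scale)         # accumulate log of scaling factors
        alpha /= scale                   # rescale to avoid numerical underflow
    return log_lik

# Toy example: 2 hidden states, 3 observation symbols
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
obs = np.array([0, 1, 2, 1])
print(forward_log_likelihood(pi, A, B, obs))
```

In a metaheuristic setting, this function would serve as the fitness of a candidate solution (pi, A, B), with the row-stochasticity constraints handled by the chosen solution encoding.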
