Abstract

The hidden Markov model (HMM) is currently the most popular approach to speech recognition, and optimizing its model parameters is of great interest to researchers in this area. Genetic algorithms (GAs) have been applied to HMM optimization, but the GA lacks hill-climbing capability. A novel GA based on tabu search (TS), called GATS, is proposed; it retains the merits of both GA and TS. Furthermore, by combining the Baum-Welch algorithm with GATS, a hybrid algorithm named GATSBW is proposed to train continuous HMMs for continuous speech recognition. GATSBW not only overcomes the slow convergence of GATS but also helps the Baum-Welch algorithm escape from local optima. Experimental results show that GATS has stronger hill-climbing ability than the GA, and that GATSBW is generally superior to the Baum-Welch algorithm in recognition performance.
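The abstract does not give the GATS algorithm itself, but the core idea it describes (a GA whose mutation step gains hill-climbing ability from a tabu-search local improvement with a tabu list of recent moves) can be sketched on a toy continuous objective. Everything below is a hypothetical illustration: the function names (`tabu_mutate`, `gats`), the tabu-key choice, and all parameters are assumptions, not the paper's actual method or its HMM parameterization.

```python
import random

def tabu_mutate(ind, fitness, tabu, step=0.2, tries=8, tabu_len=50):
    """Tabu-search-style hill climb on one individual: try small random
    perturbations, skip moves whose resulting attribute value is on the
    tabu list, and keep only improving moves (illustrative sketch)."""
    best, best_f = ind, fitness(ind)
    for _ in range(tries):
        i = random.randrange(len(ind))
        cand = list(best)
        cand[i] += random.uniform(-step, step)
        move = (i, round(cand[i], 2))   # tabu key: attribute i near this value
        if move in tabu:                # recently visited region is forbidden
            continue
        f = fitness(cand)
        if f > best_f:                  # accept only improving moves
            best, best_f = cand, f
            tabu.append(move)
            if len(tabu) > tabu_len:    # fixed-length tabu memory
                tabu.pop(0)
    return best

def gats(fitness, dim=3, pop_size=20, gens=60, seed=0):
    """Toy GA (elitist selection + one-point crossover) whose mutation
    operator is the tabu hill-climb above -- a GATS-like hybrid."""
    random.seed(seed)
    pop = [[random.uniform(-2, 2) for _ in range(dim)]
           for _ in range(pop_size)]
    tabu = []
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, dim)    # one-point crossover
            child = a[:cut] + b[cut:]
            children.append(tabu_mutate(child, fitness, tabu))
        pop = elite + children
    return max(pop, key=fitness)

# Toy objective: maximize -sum(x^2), optimum at the origin.
sphere = lambda x: -sum(v * v for v in x)
best = gats(sphere)
```

In the paper's setting the individuals would encode continuous-HMM parameters and the fitness would be the training-data likelihood, with Baum-Welch re-estimation interleaved (the GATSBW hybrid) to speed up convergence; the sketch above only demonstrates the GA-plus-tabu-list mechanics.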
