Abstract

Significant advances in deep learning have been achieved through the ability to train deeper networks, and the performance of speech recognition systems has been greatly improved by deep learning techniques. Many of these advances are associated with the development of new activation functions and their corresponding initializations. The introduction of rectified linear units (ReLU) revolutionized the use of supervised deep learning methods for speech recognition. More recently, considerable research interest has focused on activation functions such as Leaky ReLU (LReLU), Parametric ReLU (PReLU), exponential linear units (ELU), and parametric ELU (PELU). This work studies the influence of these activation functions on speech recognition performance. A hidden Markov model-deep neural network (HMM-DNN) based speech recognition system is used, in which deep neural networks with different activation functions provide the emission probabilities of the hidden Markov model. Two datasets, TIMIT and WSJ, are employed to study the behavior of the resulting systems on datasets of different sizes. The study finds that ReLU networks are superior to the other networks on the smaller dataset (TIMIT), whereas on a sufficiently larger dataset (WSJ) ELU networks are superior to the other networks.
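The activation functions compared in the abstract can be sketched as follows. This is a minimal NumPy illustration, not code from the paper; the PELU parameterization shown (learned scale `a` and slope `a/b`) is one common form and is an assumption here.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x); zero for negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # LReLU: small fixed slope alpha on the negative side
    return np.where(x > 0, x, alpha * x)

def prelu(x, alpha):
    # PReLU: same shape as LReLU, but alpha is learned per channel
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation toward -alpha for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def pelu(x, a=1.0, b=1.0):
    # PELU (one common parameterization): learned a, b control
    # the positive-side slope a/b and the negative-side saturation -a
    return np.where(x > 0, (a / b) * x, a * (np.exp(x / b) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x), leaky_relu(x), elu(x), sep="\n")
```

All five functions are identity-like for positive inputs and differ only in how they treat negative inputs, which is what drives the differences in trainability the paper investigates.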

