Abstract

This paper describes a speaker-independent word recognition algorithm based on four-layer neural networks with embedded eigenvectors. Eigenvectors from the sub-space method (SM) are used as the weights of the first hidden layer. The similarity measure given by SM is calculated by cumulative summation of the projection components of an input pattern onto a set of eigenvectors. In contrast, our new method evaluates each projection component individually to achieve better performance than SM. We propose the subspace training (SST) algorithm based on SM and the decision controlled back propagation training (DCBPT) algorithm to improve recognition performance and to reduce training time. Training and recognition experiments were performed on a 26-word vocabulary consisting of train station names. The error rate was 1.3% using SM alone and was reduced to 0.7% using the combination of neural networks and SM.
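For readers unfamiliar with the sub-space method, the following minimal sketch (not the authors' code) illustrates the similarity computation described above, assuming the common squared-projection (CLAFIC-style) form: an input pattern is projected onto each class's orthonormal eigenvectors, and the class with the largest cumulative sum of squared projection components is selected. The function names and array shapes are illustrative assumptions.

```python
import numpy as np

def sm_similarity(x, eigvecs):
    """Sub-space method similarity of one input pattern to one class.

    x       : (d,) input feature vector
    eigvecs : (m, d) orthonormal eigenvectors spanning the class sub-space
    Assumes the usual squared-projection form of the SM similarity.
    """
    x = x / np.linalg.norm(x)      # normalize the input pattern
    proj = eigvecs @ x             # projection components onto each eigenvector
    return np.sum(proj ** 2)       # cumulative summation of the components

def classify(x, class_eigvecs):
    """Assign x to the class whose sub-space gives the largest similarity."""
    sims = [sm_similarity(x, V) for V in class_eigvecs]
    return int(np.argmax(sims))
```

The paper's contribution, as stated in the abstract, is to replace this fixed cumulative summation with a trainable four-layer network whose first hidden layer is initialized with the same eigenvectors, so that each projection component can be weighted individually.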
