Abstract

This paper describes a speaker-independent word recognition algorithm based on four-layer neural networks with embedded eigenvectors. Eigenvectors from the subspace method (SM) are used as weights for the first hidden layer. The similarity measure given by SM is calculated by cumulative summation of the projection components of an input pattern onto a set of eigenvectors. In contrast, our new method evaluates each projection component individually to achieve better performance than SM. We propose the subspace training (SST) algorithm with SM and the decision controlled back propagation training (DCBPT) algorithm to improve recognition performance and to reduce training times. Training and recognition experiments were performed using a 26-word vocabulary consisting of train station names. The error rate was 1.3% using SM and was reduced to 0.7% using the combination of neural networks and SM.
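The SM similarity described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and the use of NumPy are assumptions, and the similarity is taken to be the standard subspace-method form, i.e. the sum of squared projection components of a normalized input onto each class's eigenvector set.

```python
import numpy as np

def sm_similarity(x, eigvecs):
    # Subspace-method similarity: cumulative sum of the squared
    # projection components of input pattern x onto a class subspace.
    # eigvecs: (d, k) matrix whose columns are orthonormal eigenvectors.
    x = x / np.linalg.norm(x)      # normalize the input pattern
    proj = eigvecs.T @ x           # projection components onto eigenvectors
    return np.sum(proj ** 2)       # similarity in [0, 1]

def classify(x, subspaces):
    # Assign x to the class whose subspace gives the highest similarity.
    # subspaces: dict mapping class label -> (d, k) eigenvector matrix.
    return max(subspaces, key=lambda c: sm_similarity(x, subspaces[c]))
```

The paper's contribution differs from this baseline in that each projection component is evaluated individually by the neural network rather than summed uniformly.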
