Abstract

Often, the best artificial neural network for a real-world problem is relatively complex. However, with the growing popularity of smaller computing devices (hand-held computers, cellular telephones, automobile interfaces, etc.), there is a need for simpler models with comparable accuracy. This paper presents evidence that using a larger model as an oracle to train a smaller model on unlabeled data yields (1) an acceptable model of reduced complexity and (2) improved results over standard training methods applied to a similarly sized smaller model. On automated spoken-digit recognition, oracle learning produced an artificial neural network half the size of the original that (1) maintained comparable accuracy to the larger network and (2) obtained up to a 25% decrease in error over standard training methods.
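The oracle-learning idea described above can be illustrated with a minimal sketch: a large "oracle" model labels unlabeled inputs, and a smaller model is fit to reproduce those outputs. The function names, the toy linear "small model", and the synthetic data below are illustrative assumptions, not the paper's actual network architectures or speech data.

```python
import numpy as np

rng = np.random.default_rng(0)

def oracle(x):
    # Stand-in for the large, accurate network: a fixed nonlinear function.
    # (Assumption for illustration; the paper uses a trained ANN as the oracle.)
    return np.tanh(x @ np.array([0.7, -0.3]))

# Unlabeled data: the oracle supplies the training targets.
X = rng.normal(size=(200, 2))
y_oracle = oracle(X)

# "Smaller model": a linear map fit by least squares to the oracle's outputs,
# i.e. trained to mimic the oracle rather than on ground-truth labels.
w, *_ = np.linalg.lstsq(X, y_oracle, rcond=None)
y_small = X @ w

# How closely the small model tracks the oracle on the unlabeled inputs.
mse = float(np.mean((y_small - y_oracle) ** 2))
```

In the paper's setting, both oracle and student are neural networks and the student is roughly half the oracle's size; the sketch only shows the training signal flowing from oracle outputs rather than from labeled data.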
