Abstract

We introduce an inferential approach to unsupervised learning which allows us to define an optimal learning strategy. Applying these ideas to a simple, previously studied model, we show that it is impossible to detect structure in data until a critical number of examples have been presented, an effect which will be observed in all problems with certain underlying symmetries. Thereafter, the advantage of optimal learning over previously studied learning algorithms depends critically upon the distribution of patterns; optimal learning may be exponentially faster. Models with more subtle correlations are harder to analyse, but in a simple limit of one such problem we calculate exactly the efficacy of an algorithm similar to some used in practice, and compare it to that of the optimal prescription.
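The onset of structure detection only beyond a critical number of examples can be illustrated with a toy numerical experiment. The sketch below is an assumption-laden illustration, not the model or algorithm analysed here: it draws examples from a hypothetical "spiked" Gaussian distribution with one hidden preferred direction (the direction, signal strength and dimension are all invented for the example) and measures how well the leading principal component recovers that direction as the number of examples per dimension grows. Below a critical ratio the estimate carries essentially no information about the hidden structure; above it, the overlap rises sharply.

```python
# Minimal sketch (illustrative only): data from an assumed spiked Gaussian model,
# structure estimated by the leading principal component of the sample covariance.
import numpy as np

rng = np.random.default_rng(0)

N = 400                       # dimension of each example (assumed value)
u = rng.standard_normal(N)    # hidden preferred direction (assumption of this sketch)
u /= np.linalg.norm(u)
strength = 1.5                # signal strength along u (assumed value)

for alpha in (0.2, 0.5, 1.0, 2.0, 4.0):   # alpha = examples per dimension
    P = int(alpha * N)
    # Each example: isotropic Gaussian noise plus a random component along u.
    xi = rng.standard_normal((P, N)) + np.sqrt(strength) * rng.standard_normal((P, 1)) * u
    C = xi.T @ xi / P                      # empirical covariance matrix
    eigval, eigvec = np.linalg.eigh(C)
    v = eigvec[:, -1]                      # leading principal component
    overlap = abs(v @ u)                   # |cosine| with the true hidden direction
    print(f"alpha = {alpha:.1f}  overlap = {overlap:.2f}")
```

In this toy setting the overlap stays near zero for small alpha and grows once alpha exceeds a critical value set by the signal strength, mirroring the qualitative behaviour described in the abstract.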
