Abstract

Many machine learning algorithms aim at finding “simple” rules to explain training data. The expectation is: the “simpler” the rules, the better the generalization on test data (→ Occam's razor). Most practical implementations, however, use measures for “simplicity” that lack the power, universality and elegance of those based on Kolmogorov complexity and Solomonoff's algorithmic probability. Likewise, most previous approaches (especially those of the “Bayesian” kind) suffer from the problem of choosing appropriate priors. This paper addresses both issues. It first reviews some basic concepts of algorithmic complexity theory relevant to machine learning, and how the Solomonoff-Levin distribution (or universal prior) deals with the prior problem. The universal prior leads to a probabilistic method for finding “algorithmically simple” problem solutions with high generalization capability. The method is based on Levin complexity (a time-bounded extension of Kolmogorov complexity) and inspired by Levin's optimal universal search algorithm. With a given problem, solution candidates are computed by efficient “self-sizing” programs that influence their own runtime and storage size. The probabilistic search algorithm finds the “good” programs (the ones quickly computing algorithmically probable solutions fitting the training data). Experiments focus on the task of discovering “algorithmically simple” neural networks with low Kolmogorov complexity and high generalization capability. These experiments demonstrate that the method, at least with certain toy problems where it is computationally feasible, can lead to generalization results unmatchable by previous neural net algorithms.
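The search procedure the abstract alludes to, Levin's universal search, can be sketched as follows: in phase i, every candidate program p with length l(p) ≤ i is granted a runtime budget proportional to 2^(i − l(p)), so short (algorithmically simple) programs are tried first and given the most time. The toy bit-string interpreter below is purely illustrative (it is not the "self-sizing" program representation from the paper), and the names `levin_search` and `run_toy` are this sketch's own:

```python
from itertools import product

def levin_search(run, is_solution, max_phase=20):
    """Sketch of Levin's universal search: in phase i, each program p
    with l(p) <= i gets a budget of 2**(i - l(p)) interpreter steps,
    biasing the search toward short, fast programs."""
    for phase in range(1, max_phase + 1):
        for length in range(1, phase + 1):
            budget = 2 ** (phase - length)
            for bits in product("01", repeat=length):
                program = "".join(bits)
                output = run(program, budget)  # None if budget exhausted
                if output is not None and is_solution(output):
                    return program
    return None

def run_toy(program, budget):
    """Toy interpreter (an assumption for illustration): start an
    accumulator at 1; '1' doubles it, '0' adds 1; one symbol costs
    one step. Returns None if the program exceeds its step budget."""
    if len(program) > budget:
        return None
    acc = 1
    for c in program:
        acc = acc * 2 if c == "1" else acc + 1
    return acc

# Find a short program whose output is 10.
solution = levin_search(run_toy, lambda x: x == 10)
```

The key property, mirroring the abstract's claim, is that the total work spent on any single program is within a constant factor (roughly 2^l(p)) of its own runtime, which makes the search asymptotically optimal among enumeration schemes of this kind.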
