Abstract

We investigate a model of polynomial-time concept prediction that is a relaxation of the distribution-independent model of concept learning due to Valiant. Prediction-preserving reductions are defined and are shown to be effective tools for comparing the relative difficulty of solving various prediction problems. A number of prediction-preserving reductions are given. For example, if deterministic finite automata are polynomially predictable, then so are all Boolean formulas. We develop a complexity theory for prediction problems that parallels standard complexity theory. It is shown that certain problems of concept prediction are "prediction-complete" for a complexity class: a polynomial-time algorithm for the prediction problem would imply that all languages in the complexity class are polynomially predictable. For example, polynomial-time prediction of deterministic finite automata implies the polynomial predictability of all languages in the class LOG (deterministic logspace). Similar natural prediction-complete problems are given for the standard complexity classes NC^1, NLOG, LOGCFL, and P. Showing that a prediction problem is prediction-complete for any of these classes provides varying degrees of evidence that no efficient prediction algorithm exists for the problem. Based on very weak cryptographic assumptions, we establish hardness results for prediction of Boolean circuits and other prediction problems that are prediction-complete for P. We also discuss the recent related results of Kearns and Valiant, which show that Boolean formulas and DFAs are not polynomially predictable based on the assumed intractability of computing specific cryptographic functions.
