Abstract

We consider the complexity of properly learning concept classes, i.e., when the learner must output a hypothesis of the same form as the unknown concept. We present the following new upper and lower bounds on well-known concept classes:

• We show that unless NP = RP, there is no polynomial-time PAC learning algorithm for DNF formulas where the hypothesis is an OR-of-thresholds. Note that as special cases, we show that neither DNF nor OR-of-thresholds are properly learnable unless NP = RP. Previous hardness results have required strong restrictions on the size of the output DNF formula. We also prove that it is NP-hard to learn the intersection of ℓ ⩾ 2 halfspaces by the intersection of k halfspaces for any constant k ⩾ 0. Previous work held only for the case k = ℓ.

• Assuming that NP ⊈ DTIME(2^{n^ε}) for a certain constant ε < 1, we show that it is not possible to learn size-s decision trees by size-s^k decision trees for any k ⩾ 0. Previous hardness results for learning decision trees held for k ⩽ 2.

• We present the first non-trivial upper bounds on properly learning DNF formulas. More specifically, we show how to learn size-s DNF by DNF in time 2^{Õ(√(n log s))} (a worked specialization of this bound follows the abstract).

The hardness results for DNF formulas and intersections of halfspaces are obtained via specialized graph products for amplifying the hardness of approximating the chromatic number, as well as by applying recent work on the hardness of approximate hypergraph coloring. The hardness results for decision trees, as well as the new upper bounds, are obtained by developing a connection between automatizability in proof complexity and learnability, which may have other applications.
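To give a feel for why the upper bound is non-trivial, here is a minimal worked specialization; the arithmetic below is ours, not a claim from the paper, and assumes the common parameter regime s = n^{O(1)} (polynomial-size DNF).

% Illustrative specialization of the stated running time
% 2^{\tilde{O}(\sqrt{n \log s})}; the simplification is our own
% and assumes s = n^{O(1)}, so \log s = O(\log n) and the
% \tilde{O} absorbs the polylogarithmic factor.
\[
  2^{\tilde{O}(\sqrt{n \log s})}
  \;=\; 2^{\tilde{O}(\sqrt{n \cdot O(\log n)})}
  \;=\; 2^{\tilde{O}(\sqrt{n})}
  \;\ll\; 2^{O(ns)},
\]
% where 2^{O(ns)} bounds the naive proper learner that enumerates
% all DNF hypotheses with at most s terms over n variables (each
% variable appears in a term positively, negatively, or not at all,
% giving at most (3^n + 1)^s = 2^{O(ns)} candidates to check).

That is, for polynomial-size DNF the algorithm runs in subexponential time, whereas brute-force enumeration of proper hypotheses is exponential.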
