Abstract

We study the proper learnability of axis-parallel concept classes in the PAC-learning and exact-learning models. These classes include unions of boxes, DNF, decision trees, and multivariate polynomials. For constant-dimensional axis-parallel concept classes C we show that the following three problems have time complexities within a polynomial factor of each other: (1) C is α-properly exactly learnable (with hypotheses of size at most α times the target size) from membership and equivalence queries; (2) C is α-properly PAC learnable (without membership queries) under any product distribution; (3) there is an α-approximation algorithm for the MINEQUIC problem (given g ∈ C, find a minimal-size f ∈ C that is logically equivalent to g). In particular, if one of these problems has polynomial time complexity, they all do. Using this we give the first proper-learning algorithm for constant-dimensional decision trees and the first negative results for proper learning from membership and equivalence queries for many classes. For axis-parallel concepts over a nonconstant dimension we show that, with the equivalence oracle, (1) ⇒ (3). We use this to show that (binary) decision trees are not properly learnable in polynomial time (assuming P ≠ NP) and that DNF is not s^ε-properly learnable (ε < 1) in polynomial time even with an NP-oracle (assuming Σ_2^P ≠ P^NP).
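
To make the MINEQUIC problem concrete, here is a minimal sketch (not from the paper) for the simplest axis-parallel class: unions of one-dimensional boxes, i.e., integer intervals. In this toy case a minimal equivalent representation is obtained by merging overlapping or adjacent intervals; for the richer classes studied in the paper (higher dimensions, decision trees, DNF) the analogous minimization problem is what the abstract's equivalences and hardness results are about. All names below are illustrative.

```python
from typing import List, Tuple

Interval = Tuple[int, int]  # closed integer interval [lo, hi], a 1-D axis-parallel box

def min_equivalent_union(g: List[Interval]) -> List[Interval]:
    """Toy MINEQUIC instance: return a minimal-size union of intervals
    that is logically equivalent (as a subset of the integers) to the union of g."""
    if not g:
        return []
    intervals = sorted(g)            # sort by left endpoint
    merged = [intervals[0]]
    for lo, hi in intervals[1:]:
        prev_lo, prev_hi = merged[-1]
        if lo <= prev_hi + 1:        # overlapping or adjacent intervals can be fused
            merged[-1] = (prev_lo, max(prev_hi, hi))
        else:
            merged.append((lo, hi))
    return merged

if __name__ == "__main__":
    g = [(0, 3), (2, 5), (7, 7), (8, 10)]
    print(min_equivalent_union(g))   # [(0, 5), (7, 10)]: size 2 instead of size 4
```

In one dimension this greedy merge is exact and runs in polynomial time; the paper's point is that whether such an (approximate) minimization is feasible for a given axis-parallel class is, in constant dimension, polynomially equivalent to that class being properly learnable.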
