Abstract

In this paper we extend the Monotone Theory to the PAC-learning model with membership queries. Using this extension we show that a DNF formula that has at least one "1/poly-heavy" clause in one of its CNF representations (a clause that is not satisfied with probability at least 1/poly(n, s), where n is the number of variables and s is the number of terms in f) with respect to a distribution D is weakly learnable under this distribution. It follows that DNF formulas that are not weakly learnable under the distribution D have no "1/poly-heavy" clauses in any of their CNF representations. A DNF f is called τ-CDNF if there are a τ′ > τ and a CNF representation of f containing poly(n, s) clauses that τ′-approximates f according to a distribution D. We show that the class of all τ-CDNF is weakly (τ + ϵ)-PAC-learnable with membership queries under the distribution D. We then show how to convert our algorithm into a parallel algorithm that runs in polylogarithmic time with a polynomial number of processors. In particular, decision trees are (strongly) PAC-learnable with membership queries under any distribution in parallel in polylogarithmic time with a polynomial number of processors. Finally, we show that no efficient parallel exact learning algorithm exists for decision trees.
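The τ-CDNF definition above can be restated symbolically. This is a hedged sketch: the abstract does not spell out what "τ′-approximates f according to D" means, and we assume here it means agreement with f with probability at least τ′ under D.

```latex
% Hedged formalization of the tau-CDNF definition; the approximation
% measure (agreement probability under D) is our assumption, not stated
% explicitly in the abstract.
f \in \tau\text{-CDNF} \iff
  \exists\, \tau' > \tau \;\; \exists\, C = c_1 \wedge \cdots \wedge c_m
  \text{ (a CNF representation related to } f\text{)},\;
  m \le \mathrm{poly}(n, s), \text{ such that }
  \Pr_{x \sim D}\bigl[\, C(x) = f(x) \,\bigr] \ge \tau'.
```

Under this reading, the learnability result says that whenever f admits such a short CNF that agrees with f on at least a τ′ fraction of D, the class is weakly (τ + ϵ)-PAC-learnable with membership queries under D.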
