Abstract
Inductive logic programming (ILP) is concerned with the problem of inducing concepts represented as logic programs (or Horn clauses) from examples. Top-down inductive learners such as FOIL (Quinlan 1990; Cameron-Jones et al. 1993) learn Horn clauses by adding one literal at a time using a hill-climbing search. These learners suffer from local plateaus, where the selection of a conjunction of literals, rather than a single literal, would improve the accuracy of the clause. The problem then becomes one of searching for combinations of literals rather than single literals, which requires a mechanism for searching the space of literal combinations efficiently. The FOCL system (Pazzani et al. 1991) addressed this problem by giving the concept learner hand-crafted “relational cliches”, combinations of literals to consider while learning. However, these cliches are hard to derive by hand and are often domain-specific, so it would be desirable to learn them automatically. As part of this thesis, an inductive learner called CLUSE (Cliches Learned and USEd) has been developed that learns combinations of literals called relational cliches. The underlying idea is to learn cliches from examples of a concept and to use them with a hill-climbing learner to escape local plateaus. Cliches are learned from a concept in one domain and used to learn concepts both within the same domain and across domains. When cliches are learned and used in the same domain, the literals used to express different concepts overlap, so cliches learned from one concept should provide appropriate lookahead for learning other concepts in that domain. Across domains, however, these cliches probably have few literals in common with the target concepts, hence the need for more general cliches.
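To make the plateau problem concrete, the following toy sketch (not CLUSE or FOIL itself; the example set, literal names, and purity score are invented for illustration) shows an XOR-like concept on which no single literal improves the purity of a clause, while a two-literal conjunction, i.e. a cliche-sized step, separates positives from negatives perfectly:

```python
from itertools import combinations

# Toy XOR-like concept: positives and negatives as (x, y) tuples.
pos = [(0, 1), (1, 0)]
neg = [(0, 0), (1, 1)]

# Candidate literals, modeled as boolean tests on an example.
literals = {
    "x=1": lambda e: e[0] == 1,
    "y=1": lambda e: e[1] == 1,
    "x=0": lambda e: e[0] == 0,
    "y=0": lambda e: e[1] == 0,
}

def purity(tests, pos, neg):
    """Fraction of covered examples that are positive (0 if none covered)."""
    p = sum(all(t(e) for t in tests) for e in pos)
    n = sum(all(t(e) for t in tests) for e in neg)
    return p / (p + n) if p + n else 0.0

# Single-literal hill climbing: every literal covers one positive and one
# negative, so purity stays at 0.5 -- a local plateau.
best_single = max(purity([t], pos, neg) for t in literals.values())

# Two-literal lookahead: a conjunction such as x=1 AND y=0 covers only a
# positive example, reaching purity 1.0.
best_pair = max(purity([a, b], pos, neg)
                for a, b in combinations(literals.values(), 2))
```

A greedy learner scoring literals one at a time sees no gain anywhere and stalls; a cliche that proposes the two-literal conjunction as a single candidate step escapes the plateau.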
To address this, CLUSE learns two kinds of cliches: Domain Dependent Cliches, expressed as conjunctions of literals specific to a domain, and Domain Independent Cliches, in which literals have variable predicate symbols. CLUSE is a bottom-up inductive relational learner based on Relative Least General Generalization (RLGG). To remedy the inefficiency and overgeneralization problems of RLGG, a modified version has been developed that exploits the context in which least general generalization (LGG) is applied; it is called Contextual Least General Generalization (CLGG). Empirical experiments reveal that cliches learned with CLUSE provide appropriate lookahead to escape the local plateaus of a hill-climbing learner both within and across domains. For the purposes of evaluation, FOIL has been extended to learn concepts with or without cliches. Cliches have proven useful in two application domains: the real-life task of defining structures for the finite element method (FEM), and the synthetic blocks domain, which offers a wide variety of problems (or concepts). Other application domains, such as drug design, text categorization, and detecting traffic problems, are also discussed.
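As background for RLGG and CLGG, Plotkin's least general generalization of two first-order terms can be sketched as follows (a minimal illustration assuming a tuple representation of terms, with strings as constants; CLUSE's CLGG additionally exploits the context in which the generalization is applied, which this sketch does not model):

```python
def lgg(t1, t2, subst=None):
    """Least general generalization (Plotkin) of two first-order terms.

    Terms are strings (constants) or tuples (functor, arg1, ..., argN).
    Identical terms generalize to themselves; terms with the same functor
    and arity are generalized argument-wise; any other pair is replaced by
    a variable, the same variable for every occurrence of that pair.
    """
    if subst is None:
        subst = {}  # maps (t1, t2) pairs to fresh variables, shared per call
    if t1 == t2:
        return t1
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        return (t1[0],) + tuple(lgg(a, b, subst)
                                for a, b in zip(t1[1:], t2[1:]))
    key = (t1, t2)
    if key not in subst:
        subst[key] = f"V{len(subst)}"
    return subst[key]
```

For example, `lgg(('f','a','b'), ('f','a','c'))` yields `('f','a','V0')`, and repeated mismatching pairs map to the same variable, so `lgg(('p','a','a'), ('p','b','b'))` yields `('p','V0','V0')`. RLGG applies this operation relative to background knowledge, which is the source of the size and overgeneralization problems that CLGG is designed to mitigate.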