Abstract

Reducing redundancy in search has been a major concern for automated deduction. Subgoal-reduction strategies prevent redundant search by using lemmaizing and caching, whereas contraction-based strategies prevent redundant search by using contraction rules, such as subsumption. In this work we show that lemmaizing and contraction can coexist in the framework of semantic resolution. On the lemmaizing side, we define two meta-level inference rules for lemmaizing in semantic resolution, one for unit and one for non-unit lemmas, and we prove their soundness. Rules for lemmaizing are meta-rules because they use global knowledge about the derivation, e.g. ancestry relations, in order to derive lemmas. On the contraction side, we give contraction rules for semantic strategies, and we define a purity deletion rule for first-order clauses that preserves completeness. While lemmaizing generalizes success caching of model elimination, purity deletion echoes failure caching. Thus, our approach integrates features of backward and forward reasoning.
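To make the two redundancy-control ideas contrasted above concrete, here is a minimal sketch (not the paper's calculus, and restricted to ground clauses for simplicity): `contract` performs contraction by subsumption, deleting clauses that become redundant, while `LemmaCache` stands in for unit lemmaizing / success caching by storing proved literals so a repeated subgoal closes immediately. All names and data representations are illustrative assumptions, not the authors' definitions.

```python
# Minimal illustrative sketch (assumptions, not the paper's inference system):
# ground clauses are frozensets of literal strings, e.g. frozenset({"p", "~q"}).
from typing import FrozenSet, Set

Clause = FrozenSet[str]


def subsumes(c: Clause, d: Clause) -> bool:
    """For ground clauses, c subsumes d iff every literal of c occurs in d."""
    return c <= d


def contract(clauses: Set[Clause], new: Clause) -> Set[Clause]:
    """Contraction by subsumption: discard `new` if it is subsumed,
    otherwise add it and delete the clauses it subsumes."""
    if any(subsumes(c, new) for c in clauses):
        return clauses                      # new clause is redundant
    return {c for c in clauses if not subsumes(new, c)} | {new}


class LemmaCache:
    """Toy stand-in for unit lemmaizing / success caching:
    literals proved once are reused instead of being re-derived."""

    def __init__(self) -> None:
        self.proved: Set[str] = set()

    def record(self, literal: str) -> None:
        self.proved.add(literal)            # cache the solved subgoal as a unit lemma

    def closes(self, literal: str) -> bool:
        return literal in self.proved       # a cached lemma closes the subgoal at once


if __name__ == "__main__":
    db: Set[Clause] = set()
    db = contract(db, frozenset({"p", "q"}))
    db = contract(db, frozenset({"p"}))     # {p} subsumes {p, q}, which is deleted
    assert db == {frozenset({"p"})}

    cache = LemmaCache()
    cache.record("p")                       # subgoal p proved once...
    assert cache.closes("p")                # ...and reused for free when met again
```

The sketch only shows the flavor of the two mechanisms; in the paper the lemmaizing rules are meta-level and first-order, using ancestry information from the derivation, and contraction covers more than ground subsumption.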
