Abstract

This paper is a survey of the many applications of the theory of parallel algorithms and complexity to logic programming problems. The mathematical tools relevant to the analysis of parallel logic programs include: the concept of alternation, the complexity class NC, and parallel algorithms using randomization and/or achieving optimal speed-ups. The logic programming problems addressed are related to query optimization for deductive databases and to fast parallel execution of primitive operations in logic programming languages, such as fixpoint operators, term unification, and term matching. The formal language we use to illustrate the addition of recursion to database queries is the language of logical query programs (or programs), whose evaluation has been the object of much recent research. Our presentation highlights the fact that all of these results can be stated using the stage function s_H(n) of a program H, where n is the size of the database queried. In this context we also present two new observations. (1) Given a linear single rule program H, it is NP-hard to decide whether s_H(n) = O(1); this extends the bounded recursion analysis of [57, 42]. (2) There is a program that can be evaluated in s(n) = O(log n) stages, but whose derivation trees (of width 1) are of size exponential in n. An extension of derivation trees from width 1 to constant width properly strengthens the fast evaluation method of [76].
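To make the notion of the stage function concrete, the following minimal sketch (not taken from the paper; the function name evaluate_stages and the example relation are illustrative) evaluates a linear single rule program, here transitive closure, by naive bottom-up iteration and counts the number of stages needed to reach the fixpoint. That stage count plays the role of s_H(n) for this program on a database of size n.

```python
# Minimal sketch, assuming a linear single rule program of the form
#   T(x, y) :- E(x, y).
#   T(x, y) :- T(x, z), E(z, y).
# evaluated bottom-up; the stage count is the value of the stage
# function s_H(n) on the given database. Names are illustrative only.

def evaluate_stages(edges):
    """Return (transitive closure, number of stages until the fixpoint)."""
    closure = set(edges)          # stage 0: apply the non-recursive rule
    stages = 0
    while True:
        # one stage: apply the recursive rule to all facts derived so far
        new_facts = {(x, w)
                     for (x, y) in closure
                     for (z, w) in edges
                     if y == z and (x, w) not in closure}
        if not new_facts:
            return closure, stages
        closure |= new_facts
        stages += 1

if __name__ == "__main__":
    # A chain a -> b -> c -> d needs a number of stages linear in the
    # chain length, illustrating a program with a linear stage function.
    chain = {("a", "b"), ("b", "c"), ("c", "d")}
    tc, stages = evaluate_stages(chain)
    print(sorted(tc), "stages:", stages)
```

Programs whose stage function is O(log n), such as the one mentioned in observation (2), are exactly the kind amenable to the fast parallel evaluation methods discussed in the survey.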
