Abstract

Integrating knowledge from various sources is a recurring problem in Artificial Intelligence, often addressed by multi-context systems (MCSs). Existing MCSs, however, have limited support for the open-world semantics of knowledge bases (KBs) expressed in knowledge representation languages based on first-order logic. To address this problem, we introduce knowledge base networks (KBNs), which consist of open-world KBs linked by non-monotonic bridge rules under a stable model semantics. Basic entailment in KBNs is decidable whenever it is decidable in the individual KBs. This follows from a fundamental representation theorem, which also allows us to derive complexity results and offers a perspective for implementation. In particular, for networks of KBs in well-known Description Logics (DLs), reasoning is reducible to reasoning in non-monotonic dl-programs. As a by-product, we obtain an embedding of a core fragment of Motik and Rosati's hybrid MKNF KBs, which amount to a special case of KBNs, into dl-programs. We also show that reasoning in networks of ontologies in lightweight DLs is not harder than in answer set programming.
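
To give a flavour of the bridge rules (the concrete syntax below is an illustrative assumption in the usual multi-context-system style, not taken from the paper), a rule such as

  (1 : flies(tweety)) <- (2 : Bird(tweety)), not (2 : Penguin(tweety))

would add flies(tweety) to the first KB whenever the second KB entails Bird(tweety) but does not entail Penguin(tweety). The default negation "not" is what makes such rules non-monotonic and motivates a stable model semantics for the network.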
