Abstract

Automated Planning techniques can be leveraged to build effective decision support systems that assist the human-in-the-loop. Such systems must provide intuitive explanations when their suggestions seem inexplicable to the user. In this regard, we consider scenarios where the user questions the system's suggestion by offering alternatives (referred to as foils). In response, we empower existing decision support technologies to engage in an interactive explanatory dialogue with the user, providing contrastive explanations based on user-specified foils in order to reach a consensus on proposed decisions. To provide these contrastive explanations, we adapt existing techniques from Explainable AI Planning (XAIP). Furthermore, we use this dialogue to elicit the user's latent preferences and propose three modes of interaction that use these preferences to provide revised plan suggestions. Finally, we showcase a decision support system that provides all of these capabilities.
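
As a rough illustration of the kind of foil evaluation such an explanatory dialogue can rest on, the sketch below checks a user-supplied alternative plan against a toy action model and reports why it is inexecutable, misses the goal, or costs more than the suggested plan. The domain, action representation, and messages are invented for illustration only and are not the system or techniques described in the paper.

```python
# Illustrative sketch only: a toy foil check for contrastive explanations.
# The action model, plans, and messages are hypothetical examples.

from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    preconditions: frozenset
    add_effects: frozenset
    delete_effects: frozenset = frozenset()
    cost: int = 1

def simulate(state, plan):
    """Apply a plan to a state; return (final_state, total_cost, failure_reason)."""
    state = set(state)
    total = 0
    for act in plan:
        missing = act.preconditions - state
        if missing:
            return state, total, f"action '{act.name}' requires {sorted(missing)}"
        state -= act.delete_effects
        state |= act.add_effects
        total += act.cost
    return state, total, None

def contrastive_explanation(initial_state, goal, system_plan, foil):
    """Explain why the system's suggested plan is preferred over the user's foil."""
    _, sys_cost, _ = simulate(initial_state, system_plan)
    end, foil_cost, failure = simulate(initial_state, foil)
    if failure:
        return f"Your alternative is not executable: {failure}."
    unmet = goal - end
    if unmet:
        return f"Your alternative leaves goal(s) {sorted(unmet)} unachieved."
    if foil_cost > sys_cost:
        return (f"Your alternative works but costs {foil_cost}, "
                f"while the suggested plan costs {sys_cost}.")
    return "Your alternative is as good as the suggestion; the system can adopt it."

# Hypothetical toy domain: water must be boiled before coffee can be brewed and served.
boil = Action("boil_water", frozenset({"have_water"}), frozenset({"hot_water"}))
brew = Action("brew_coffee", frozenset({"hot_water", "have_grounds"}), frozenset({"coffee"}))
serve = Action("serve", frozenset({"coffee"}), frozenset({"served"}))

init = {"have_water", "have_grounds"}
goal = {"served"}

print(contrastive_explanation(init, goal, [boil, brew, serve], [brew, serve]))
# -> Your alternative is not executable: action 'brew_coffee' requires ['hot_water'].
```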
