Abstract

In our current research into the design of cognitively well-motivated interfaces relying primarily on the display of graphical information, we have observed that graphical information alone does not provide sufficient support to users, particularly when situations arise that do not conform to the users' expectations. This can occur when too much information is requested, too little, or information of the wrong kind. To address this problem, we are working towards integrating natural language generation to augment the interaction functionalities of the interface. This integration is intended to support the generation of flexible natural language utterances that pinpoint possible problems with a user's request and then outline the user's most sensible courses of action away from the problem. In this paper, we describe our first prototype, in which we combine the graphical and interaction planning capabilities of our graphical information system SIC! with the text generation capabilities of the Penman system. We illustrate the need for such a combined system and give examples of how a general natural language facility beneficially augments the user's ability to navigate a knowledge base graphically.
