Abstract

Natural language communication interfaces have usually employed linear strings of words for man-machine communication. A lot of 'intelligence', in the form of semantic, syntactic and other information, is used to analyse these strings and to puzzle out their structures. However, the use of linear strings of words, while appropriate for communication between humans, seems inappropriate for communication with a machine using video displays, keyboards and a mouse. One need not demand too much of machines in the analysis of natural language input; one could bypass these problems by using alternative approaches to man-machine communication. One such approach is described in this paper, for the communication of the content and structure of natural language sentences. The basic idea is that the human user of the interface should use the two-dimensional screen, mouse and keyboard to create structures for input, guided by appropriate software. Another key idea is the use of a high degree of interaction to avoid some problems usually encountered in natural language understanding. Based on this approach, a system called ScreenTalk has been implemented in Common LISP on a VAX workstation. The man-machine interface is used to interactively input both the content and the structure of sentences. Users may then ask questions, which are answered using stored information. ScreenTalk now operates on a database of brief news items. It has been designed to be fairly domain independent, and is expected to be used soon in other applications. The conceptual framework for such an approach, the design of the experimental interface used to test this framework and the authors' experience with this interface are presented.
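To make the idea of storing both the content and the structure of sentences concrete, the following is a minimal sketch in Common Lisp (the paper's implementation language). It assumes a simple case-frame representation with slots such as agent, action and object, and a matching scheme in which a question is a partially filled frame; these slot names and the matching rule are illustrative assumptions for exposition, not the actual ScreenTalk data structures or algorithms.

```lisp
;;; Illustrative sketch only: the slot names and the matching scheme below
;;; are assumptions for exposition, not ScreenTalk's actual representation.

(defstruct sentence-frame
  agent action object place time)   ; a simple case-frame for one news item

(defvar *news-items* '()
  "Database of structured sentences built interactively by the user.")

(defun store-sentence (&rest slots)
  "Add one structured sentence (as built via the screen/mouse interface)."
  (push (apply #'make-sentence-frame slots) *news-items*))

(defun frame-matches-p (frame query)
  "A frame answers a query when every non-NIL slot of the query agrees with it."
  (every (lambda (reader)
           (let ((q (funcall reader query)))
             (or (null q) (equal q (funcall reader frame)))))
         (list #'sentence-frame-agent #'sentence-frame-action
               #'sentence-frame-object #'sentence-frame-place
               #'sentence-frame-time)))

(defun answer-question (query)
  "Return all stored items consistent with a partially specified query frame."
  (remove-if-not (lambda (frame) (frame-matches-p frame query)) *news-items*))

;; Example: store one news item, then ask "who visited Delhi?"
(store-sentence :agent "minister" :action "visited" :object "Delhi" :time "Monday")
(answer-question (make-sentence-frame :action "visited" :object "Delhi"))
```

Because the user supplies the structure interactively, a sketch like this needs no parser: question answering reduces to matching explicit structures against stored ones, which is the problem-avoidance the abstract refers to.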
