Consider a large-format display before the user, bearing a multiplicity of “windows,” like little movies, most of them dynamic and in color. There are upwards of 20 windows, say, more than a person can ordinarily absorb at once. Some of the windows come and go, reflecting their nature as direct TV linkages into real-time, real-world events. Others are non-real-time, some dynamic, others static but capable of jumping into motion. Such an ensemble of information inputs reflects the managerial world of the top-level executive of the not-too-distant electronic future: a world of brevity, fragmentation, and variety, above all one of an overwhelming onslaught of events. The multiplicity and simultaneity of such a display situation would ordinarily make coping with it untenable. The intent of the reported research is to introduce order and control through the creation of a dynamic, gaze-interactive interface. Making the behavior and reactivity of the “windows” contingent upon measured eye movements, that is, the observer's point-of-regard, aims both to help the observer cope with the onslaught of events and to maintain continuing close contact with that ever-changing ensemble. A simulation of such a world is described and demonstrated in the composite medium of computer, videodisc, and video special effects. Eye-tracking technology, integrated with speech and manual inputs, controls the display's visual dynamics and orchestrates its sound accompaniments. All elements are combined to form a testbed for the conception generally and for exploring the associated human factors and stagecraft.
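To make the gaze-contingent mechanism concrete, the following is a minimal sketch of one way window reactivity could be keyed to the point-of-regard: each gaze sample is hit-tested against the window regions, and only the fixated window is granted motion and sound while the others fall back to quiet, static imagery. All names here (`Window`, `orchestrate`, the tiled layout) are hypothetical illustrations under these assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical sketch of a gaze-contingent window ensemble, assuming an
# eye tracker that reports the observer's point-of-regard in screen
# coordinates. Illustrative only; not the system described in the paper.

@dataclass
class Window:
    name: str
    x: int
    y: int
    w: int
    h: int
    playing: bool = False   # dynamic imagery running?
    audible: bool = False   # sound track unmuted?

    def contains(self, px: int, py: int) -> bool:
        # Hit-test a gaze sample against this window's screen region.
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def orchestrate(windows: List[Window], gaze: Optional[Tuple[int, int]]) -> None:
    """Make each window's behavior contingent on the point-of-regard:
    the fixated window comes alive (motion plus sound); the rest fall
    back to still, silent thumbnails so the ensemble stays manageable."""
    for win in windows:
        fixated = gaze is not None and win.contains(*gaze)
        win.playing = fixated
        win.audible = fixated

# Example: 20 windows tiled on a large display; only the window under
# the observer's gaze is active at any given moment.
windows = [Window(f"feed-{i}", (i % 5) * 320, (i // 5) * 240, 320, 240)
           for i in range(20)]
orchestrate(windows, gaze=(400, 100))   # falls within feed-1's region
assert windows[1].playing and not windows[0].playing
```

In a running system, `orchestrate` would be called from a sampling loop at the tracker's frame rate, presumably with fixation filtering so that windows do not flicker on and off during saccades; that filtering step is omitted here for brevity.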