Abstract
Human performance in natural environments is deeply impressive, and still well beyond current AI. Experimental techniques, such as eye tracking, may be useful for understanding the cognitive basis of this performance, and “the human advantage.” Driving is a domain where these techniques may be deployed, in tasks ranging from rigorously controlled laboratory settings through high-fidelity simulations to naturalistic experiments in the wild. This research has revealed robust patterns that can be reliably identified and replicated in the field and reproduced in the lab. The purpose of this review is to cover the basics of what is known about these gaze behaviors, and some of their implications for understanding visually guided steering. The phenomena reviewed will be of interest to those working in any domain where visual guidance and control with similar task demands are involved (e.g., many sports). The paper is intended to be accessible to the non-specialist, without oversimplifying the complexity of real-world visual behavior. The literature reviewed will provide an information base useful for researchers working on oculomotor behaviors and physiology in the lab who wish to extend their research into more naturalistic locomotor tasks, and for researchers in more applied fields (sports, transportation) who wish to bring aspects of the real-world ecology under experimental scrutiny. As this paper is part of a Research Topic on Gaze Strategies in Closed Self-paced Tasks, that aspect of the driving task is discussed in particular. It is emphasized why it is important to carefully separate the visual strategies of driving (quite closed and self-paced) from visual behaviors relevant to other forms of driver behavior (an open-ended menagerie of behaviors). There is always a balance to strike between ecological complexity and experimental control. One way to reconcile these demands is to look for natural, real-world tasks and behaviors that are rich enough to be interesting yet sufficiently constrained and well understood to be replicated in simulators and the lab. This ecological approach, with driving as a model behavior, and the way this research bridges the “lab” and the “real world,” will be of interest to anyone keen to develop more ecologically representative designs for studying human gaze behavior.
Highlights
Human behavior in the natural world is deeply impressive
The literature reviewed here will provide core readings for researchers working on oculomotor behaviors and physiology in the lab who wish to extend their research into driving or other naturalistic visual-locomotor tasks, and for researchers in more applied fields who are interested in bringing aspects of the real-world ecology under experimental scrutiny
Understanding how humans and other animals cope with real-world task demands is one key component in figuring out “the human advantage” we still hold over machines in natural task domains
Summary
Human behavior in the natural world is deeply impressive. Walking in a crowd, bicycling, or driving are carried out with an everyday ease that belies the fact that they are underpinned by sophisticated cognitive mechanisms. In these natural tasks, humans rule; in computer games and board games, by contrast, it is the machines that vastly outperform humans (as long as the machine is not required to physically move the pieces!). This “human advantage” suggests the human brain has discovered, in evolution and individual development, strategies and techniques for organizing perception and action that fit our natural ecology well, but which may be different from those of current AI. Through an ecological approach of reproducing and studying in the lab visual strategies that demonstrably occur in natural task domains, and which have been adapted to deal with the complexity and ambiguity of the real world (not just the experimental task), one may hope to discover mechanisms and principles underlying “the human advantage.” What has such research found out so far?