Abstract: Shared Control, where the machine and the human share tasks and control the situation together, and its extension, cooperative automation, are promising approaches to overcoming automation-induced problems such as loss of situation awareness and skill degradation. However, Shared Controllers and cooperative human-machine systems must be designed with great care. One of the major issues is conflict between the human and the machine: how can such conflicts be detected, and how should they be resolved, if necessary? A complicating factor is that conflicts are undesirable when the human is right (resulting in nuisance, degraded performance, etc.), but desirable when the machine is right (warning the operator, or providing proper assistance or overruling). Research has identified several types and causes of conflicts, but offers no coherent framework for design and evaluation guidelines. In this paper, we propose such a theoretical framework to structure and relate the different types of conflicts. The framework is inspired by hierarchical task analysis and identifies five possible sources of conflict: intent, information gathering, information processing, decision-making, and action implementation. Examples of conflicts in several application domains, such as automotive driving, telerobotics, and surgery, are discussed to illustrate the applicability of this framework.