Abstract

The working hypothesis in this project is that gaze interactions play a central role in structuring the joint control and guidance strategy of the human operator performing spatial tasks. Perceptual guidance and control is the idea that the visual and motor systems form a unified perceptuo-motor system in which the necessary information is naturally extracted by the visual system. As a consequence, the response of this system is constrained by the visual and motor mechanisms, and these effects should manifest in the behavioral data. Modeling the perceptual processes of the human operator provides the foundation necessary for a systems-based approach to the design of control and display systems used by remotely operated vehicles. This paper investigates this hypothesis using flight tasks conducted with remotely controlled miniature rotorcraft in indoor settings, which provide rich environments for investigating the key processes supporting spatial interactions. This work also applies to spatial control tasks in a range of application domains that include tele-operation, gaming, and virtual reality. The human-in-the-loop system combines the dynamics of the vehicle, environment, and human perception–action, with the response of the overall system emerging from the interplay of perception and action. The main questions to be answered in this work are as follows: (i) what is the general control and guidance strategy of the human operator, and (ii) how is information about the vehicle and environment extracted visually by the operator? The general approach uses gaze as the primary sensory mechanism, decoding the gaze patterns of the pilot to provide information for estimation, control, and guidance. This work differs from existing research by taking what have largely been conceptual ideas on action–perception and structuring them for implementation in a real-world problem.
The paper proposes a system model that captures the human pilot's perception–action loop, delineating the main components of the pilot's perceptuo-motor system: estimation of the vehicle state and task elements based on operator gaze patterns, trajectory planning, and tracking control. The identified human visuo-motor model is then exploited to demonstrate how the system's perceptual and control functions can be augmented to reduce the operator workload.
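The three components named above (gaze-based state estimation, trajectory planning, tracking control) can be illustrated with a minimal sketch of one pass through such a perception–action loop. All function names, the blending weight `alpha`, and the proportional gain `kp` are illustrative assumptions for this sketch, not elements of the paper's identified model.

```python
import numpy as np

def estimate_state(gaze_point, prior, alpha=0.5):
    """Blend a gaze-derived position cue with the prior state estimate.
    (alpha is an assumed mixing weight, not a parameter from the paper.)"""
    return alpha * np.asarray(gaze_point) + (1 - alpha) * np.asarray(prior)

def plan_trajectory(state, goal, n_steps=10):
    """Plan a straight-line reference trajectory from the current
    estimate toward the goal, discretized into n_steps waypoints."""
    return [state + (goal - state) * t / n_steps for t in range(1, n_steps + 1)]

def tracking_control(state, reference, kp=1.0):
    """Proportional tracking controller on the position error."""
    return kp * (reference - state)

# One pass through the loop: perceive, estimate, plan, act.
prior = np.array([0.0, 0.0])       # previous vehicle-position estimate
gaze = np.array([0.2, 0.1])        # gaze-derived cue about vehicle position
goal = np.array([1.0, 1.0])        # task element (e.g., a gate to fly through)

estimate = estimate_state(gaze, prior)        # -> array([0.1, 0.05])
trajectory = plan_trajectory(estimate, goal)  # 10 waypoints ending at goal
command = tracking_control(estimate, trajectory[0])
```

In a real implementation each of these blocks would be far richer (e.g., a dynamic filter for estimation and a model-based tracking law), but the sketch shows the data flow the system model describes: gaze informs estimation, estimation feeds planning, and planning drives control.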

Highlights

  • Recent years have seen rapid advances in fields such as robotics and sensor technology that are fundamentally changing the way in which humans interact with the world

  • Improved robotics technology has led to an expanding number of applications that range from self-driving cars [1,2], to robotic-assisted surgery [3], and further to the wide availability of small-scale unmanned aerial vehicles [4]

  • The present paper models the interactions of the operator’s gaze and control behavior, following the hypothesis that human remote control is conditioned by gaze dynamics and other functional constraints, in particular perceptual guidance mechanisms


Introduction

Recent years have seen rapid advances in fields such as robotics and sensor technology that are fundamentally changing the way in which humans interact with the world. Sensor capabilities have advanced to the point where inexpensive measurements of human gaze and body motion are available. Combining these technologies allows for a systematic modeling approach that utilizes the data captured from experimental flight tests to characterize the human pilot’s interaction with the vehicle and environment. Autonomous or semi-autonomous operations have made important progress, but for the foreseeable future human teleoperation will remain a critical modality, in particular for interactive tasks such as surgery or vehicle operation in complex environments. (Sensors 2018, 18, 2979; doi:10.3390/s18092979)

