Abstract

Manufacturing challenges are increasing the demand for more agile and dexterous means of production. At the same time, these systems aim to maintain or even increase productivity. The challenges arising from these developments can be tackled through human–robot collaboration (HRC). HRC requires effective task distribution according to each party's distinctive strengths, which is envisioned to generate synergetic effects. To enable seamless collaboration, the human and robot require mutual awareness, which is challenging because the human and robot "speak" different languages, analogue and digital respectively. This challenge can be addressed by equipping the robot with a model of the human. Although a range of models is available, data-driven models of the human are still at an early stage. For this purpose, this paper proposes an adaptive human sensor framework, which incorporates objective, subjective, and physiological metrics, as well as associated machine learning. The framework is thus envisioned to adapt to the uniqueness and dynamic nature of human behavior. To test the framework, a validation experiment was performed with 18 participants, aiming to predict perceived workload during two scenarios: a manual and an HRC assembly task. Perceived workload has a substantial impact on a human operator's task performance. Throughout the experiment, physiological data from an electroencephalogram (EEG), an electrocardiogram (ECG), and a respiration sensor were collected and interpreted. For subjective metrics, the standardized NASA Task Load Index was used. Objective metrics included task completion time and the number of errors/assistance requests. Overall, the framework showed promising potential for adaptive behavior, which is ultimately envisioned to enable more effective HRC.
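The fusion step described above — combining physiological (EEG, ECG, respiration) and objective (completion time, errors) features into one vector and regressing it onto the NASA-TLX workload score — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature names, the synthetic data, and the choice of a simple ridge-regression baseline are all assumptions.

```python
import numpy as np

# Illustrative sketch: fuse multimodal features into one matrix X and
# regress onto the NASA-TLX workload score y. All feature columns and
# the synthetic data below are hypothetical placeholders.
rng = np.random.default_rng(0)

n_trials = 90  # e.g. several trials per participant
X = np.column_stack([
    rng.normal(10, 2, n_trials),    # EEG alpha-band power (a.u.)
    rng.normal(50, 10, n_trials),   # ECG heart-rate variability, RMSSD (ms)
    rng.normal(15, 3, n_trials),    # respiration rate (breaths/min)
    rng.normal(120, 30, n_trials),  # task completion time (s)
    rng.poisson(1.0, n_trials),     # errors / assistance requests
])
y = rng.uniform(0, 100, n_trials)   # NASA-TLX score (synthetic target)

# Ridge-regression baseline: standardize features, append a bias column,
# and solve (A^T A + lam*I) w = A^T y in closed form.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([Xs, np.ones(n_trials)])
lam = 1.0
w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

pred = A @ w
mae = np.abs(pred - y).mean()
print(f"training MAE: {mae:.1f} TLX points")
```

In a deployed framework, such a model would be retrained per operator so that predictions adapt to individual baselines, which is the adaptivity the framework targets.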

Highlights

  • The manufacturing landscape is moving towards the production of customized and personalized products [1]

  • This paper presents a novel framework to establish data-driven models of the human for human–robot collaboration (HRC)

  • Task scheduling and allocation focus on dispatching tasks according to the availability of a resource, and on reaching an optimum based on different criteria such as assigning the most skilled entity, product requirements, and even energy consumption [11, 12]


Introduction

The manufacturing landscape is moving towards the production of customized and personalized products [1]. Because manual assembly systems cannot maximize productivity, there is a strong motivation to increase the level of automation in these domains [6]. This is intended to overcome weaknesses associated with human workers, such as susceptibility to high workload, fatigue, and stress [4]. In contrast to fully automated systems, the human worker introduces a new level of uncertainty and unpredictability [3]. This includes varying levels of worker expertise and current state, including health and fatigue, as well as comfort and ergonomic requirements [13]. These requirements are expected to lead to constant and dynamic adjustments in task assignments, as well as during the execution of a task (adapted robot behavior) [5]. This reinforces the need for accurate models of the human, which allow the collaborative robot to better understand its human partner.

