Abstract

Due to the epidemic threat, more and more companies decide to automate their production lines. Given the lack of adequate safety measures or space, in most cases such companies cannot use classic production robots. The solution to this problem is the use of collaborative robots (cobots). However, the required equipment (force sensors) or alternative methods of detecting a threat to humans are usually quite expensive. This article presents the practical aspect of collision detection with the use of a simple neural architecture. A virtual force and torque sensor, implemented as a neural network, may be useful in a team of collaborative robots. Four different approaches are compared in this article: auto-regressive (AR), recurrent neural network (RNN), convolutional long short-term memory (CNN-LSTM) and mixed convolutional LSTM network (MC-LSTM). These architectures are analyzed at different levels of input regression (motor current, position, speed, control velocity). The sensor was tested on the original CURA6 robot prototype (Cooperative Universal Robotic Assistant 6) by Intema. The test results indicate that the MC-LSTM architecture is the most effective, with the regression level set at 12 samples (at 24 Hz). The mean absolute prediction error obtained by the MC-LSTM architecture was approximately 22 Nm. An external test (72 different signals with collisions) shows that the presented architecture can be used as a collision detector: the MC-LSTM collision detection F1 score with the optimal threshold was 0.85. A well-developed virtual sensor based on such a network can be used to detect various types of collisions of a cobot or of other mobile or stationary systems operating on the basis of human-machine interaction.
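The detector described above compares the measured joint torque with the network's prediction and reports a collision when the residual exceeds a tuned threshold. The sketch below is a minimal NumPy illustration of this residual-threshold idea; the function name, the per-joint signal layout and the 22 Nm threshold (borrowed from the reported mean absolute error) are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def detect_collision(measured_tau, predicted_tau, threshold=22.0):
    """Flag time steps where the torque-prediction residual exceeds a threshold.

    measured_tau, predicted_tau: arrays of shape (T, n_joints), in Nm.
    Returns a boolean array of shape (T,): True where any joint's
    absolute residual exceeds the threshold.
    """
    residual = np.abs(np.asarray(measured_tau) - np.asarray(predicted_tau))
    return residual.max(axis=1) > threshold

# Example with two joints: a torque spike on joint 2 at t=1 trips the detector.
measured = np.array([[1.0, 2.0], [1.0, 30.0], [0.5, 1.0]])
predicted = np.array([[1.2, 2.1], [1.1, 2.0], [0.4, 1.1]])
print(detect_collision(measured, predicted))  # [False  True False]
```

In practice the threshold is chosen on a validation set with known collisions (the paper reports an F1 score of 0.85 at the optimal threshold).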

Highlights

  • This article proposes a solution to the problem of robot collision detection using a virtual force sensor based on neural networks

  • In the artificial neural network (ANN) cases, the tested Convolutional Neural Network (CNN)-Long Short-Term Memory (LSTM) network was built from one convolutional layer (64 filters with the Rectified Linear Unit, ReLU, activation function), a max-pooling layer (stride 2) and an LSTM layer (100 cells)
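The stacked model in the last highlight can be sketched in a few lines of Keras. In the sketch below, the 12-step input window (the regression level reported in the abstract), the four input channels (motor current, position, speed, control velocity), the convolution kernel size of 3 and the six torque outputs (one per joint of a 6-axis arm) are assumptions for illustration, not the authors' exact configuration.

```python
from tensorflow.keras import layers, models

# One Conv1D layer (64 filters, ReLU), max pooling with stride 2,
# an LSTM layer (100 cells), and a linear regression head.
model = models.Sequential([
    layers.Input(shape=(12, 4)),                  # 12-sample window, 4 signals (assumed)
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2, strides=2),
    layers.LSTM(100),
    layers.Dense(6),                              # torque estimate per joint (assumed)
])
model.summary()
```

The convolution extracts short-range temporal features from the raw signals, pooling halves the sequence length, and the LSTM aggregates the remaining steps into a single state before the dense regression layer.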

Introduction

We live in times when life expectancy is significantly increasing. Rapidly developing technology and widespread access to medicines make life easier, healthier and longer. Classic industrial robots, however, use great strength and speed, which makes them dangerous to humans. This trend is evolving, and industry is increasingly using collaborative robots, called cobots, which first appeared in the late 1990s [3]. ISO/TS 15066 states that the maximum speed of a cobot should not exceed 250 mm/s measured at the center of the tool, while the maximum impact force should be less than 65–210 N (depending on the body part). Thanks to such restrictions, cobots are safer for people. To reduce the costs of the pre-collision approach, one can use RGB-D cameras. In this application, deep neural networks (e.g., YOLO, You Only Look Once, a real-time object detection system) used to detect an intruder provide good results. One can obtain even better results using neural networks (e.g., OpenPose) for detecting and estimating poselets (a poselet, a positioning or position distribution, is a specific description, characteristic or structure related to the mechanics of the human figure, based on pose markers: key marker points, such as points on the feet, ankles, knees, hips, arms, elbows, wrists, neck and head, that suitably define a human pose or gesture) [6,13].

Contribution
The Structure
Problem Formulation
Dynamic Model of a Robot
Collision Monitoring Methods
Torque Estimation
Black Box Model
Dataset—Experimental Setup
ANN Models
RNN Model
CNN-LSTM Model
MC-LSTM Model
Learning Results
Summary
Collision Detection
Statistical Approach
Prediction Approach
Results
Test on an External Dataset
Computing Resources and Timing
Conclusions
