A flexible production model in which humans and robots collaborate has become an urgent need in industrial manufacturing, and research on human-machine fusion systems has therefore received increasing attention in recent years. Human-robot interaction is already widely used in industrial production and in everyday entertainment, for example in industrial control and VR games. In this article, we design and implement a human-machine fusion system that supports different interactive tasks, in which human motion estimation and interactive robot control are the key technologies. First, we propose a human pose estimation method based on the fusion of information from multiple cameras and inertial measurement units (IMUs). The method formulates an optimization problem that fuses 2D joint detections from multiple camera views with measurements from body-worn IMUs to estimate the kinematic pose of the human body, alleviating the incomplete pose information and noise sensitivity of any single sensor and thereby improving the accuracy of pose estimation. Second, considering the kinematic characteristics of the robot and the requirements of human-robot interaction, we design a robot control strategy based on target-point tracking and model predictive control, which allows the robot to adapt to dynamic environments and different interaction needs by adjusting its control parameters while ensuring the safety of both the robot and the operator. Finally, we conducted human-robot interaction experiments including motion following, item delivery, and active obstacle avoidance. The experimental results demonstrate the effectiveness and reliability of the designed robot interaction system in a human-machine fusion environment.
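The abstract does not give the exact cost function used for the camera-IMU fusion, so the following is only a minimal sketch of one plausible formulation: the unknown 3D joint positions are refined by nonlinear least squares, combining confidence-weighted 2D reprojection residuals from each camera with bone-direction residuals from limb-mounted IMUs. All function names, weights, and the residual design are assumptions for illustration, not the authors' actual method.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup: J joints, C calibrated pinhole cameras, and a few
# body-worn IMUs that each report the world-frame direction of one limb segment.

def project(P, X):
    """Project 3D points X (J,3) with a 3x4 camera matrix P -> 2D pixels (J,2)."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])      # homogeneous coordinates
    uvw = (P @ Xh.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def residuals(x, cams, det_2d, det_conf, imu_dirs, imu_bones, w_imu=1.0):
    """Stack reprojection residuals (all cameras) and IMU bone-direction residuals."""
    X = x.reshape(-1, 3)                               # current 3D joint estimate
    res = []
    for P, uv, conf in zip(cams, det_2d, det_conf):
        # confidence-weighted 2D reprojection error for this camera view
        res.append((conf[:, None] * (project(P, X) - uv)).ravel())
    for d_meas, (j_parent, j_child) in zip(imu_dirs, imu_bones):
        # the bone direction implied by the pose should match the IMU-measured direction
        bone = X[j_child] - X[j_parent]
        bone = bone / (np.linalg.norm(bone) + 1e-9)
        res.append(w_imu * (bone - d_meas))
    return np.concatenate(res)

def estimate_pose(x0, cams, det_2d, det_conf, imu_dirs, imu_bones):
    """Refine an initial 3D pose estimate by fusing all camera and IMU measurements."""
    sol = least_squares(residuals, x0.ravel(),
                        args=(cams, det_2d, det_conf, imu_dirs, imu_bones))
    return sol.x.reshape(-1, 3)
```

In this kind of formulation the camera terms anchor the global joint positions while the IMU terms constrain limb orientation, which is one way a fused estimate can remain usable when a joint is occluded in some views.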
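For the control side, the abstract only names target-point tracking with model predictive control, so the sketch below is likewise illustrative rather than the authors' controller: a receding-horizon problem over a simple double-integrator model of the end effector, tracking a target point under assumed velocity and acceleration limits as a stand-in for the safety constraints mentioned in the abstract.

```python
import numpy as np
import cvxpy as cp

def mpc_step(p0, v0, target, horizon=10, dt=0.05,
             w_track=10.0, w_effort=0.1, v_max=0.5, a_max=2.0):
    """Solve one MPC problem; return the first acceleration command (3,).

    Model, horizon, weights, and limits are illustrative assumptions.
    """
    p = cp.Variable((3, horizon + 1))   # end-effector position over the horizon
    v = cp.Variable((3, horizon + 1))   # velocity
    a = cp.Variable((3, horizon))       # acceleration (control input)

    cost = 0
    constraints = [p[:, 0] == p0, v[:, 0] == v0]
    for k in range(horizon):
        cost += w_track * cp.sum_squares(p[:, k + 1] - target)   # track the target point
        cost += w_effort * cp.sum_squares(a[:, k])               # penalize control effort
        constraints += [p[:, k + 1] == p[:, k] + dt * v[:, k],   # double-integrator dynamics
                        v[:, k + 1] == v[:, k] + dt * a[:, k],
                        cp.norm(v[:, k + 1], "inf") <= v_max,    # speed limit (safety)
                        cp.norm(a[:, k], "inf") <= a_max]        # actuation limit

    cp.Problem(cp.Minimize(cost), constraints).solve()
    return a.value[:, 0]   # apply only the first input, then re-plan (receding horizon)
```

Adjusting parameters such as `w_track`, `v_max`, or the horizon length is one concrete way a controller of this form could be tuned to different interaction tasks, in the spirit of the parameter-adjustable strategy described in the abstract.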