Assistive robotic platforms have recently gained popularity in healthcare applications, and their use has expanded to settings such as education, tourism, and manufacturing. These social robots, often in the form of bio-inspired humanoid systems, provide significant psychological and physiological benefits through one-on-one interactions. To optimize the interaction between social robotic platforms and humans, it is crucial for these robots to identify and mimic human motions in real time. This research presents a motion prediction model developed using convolutional neural networks (CNNs) to efficiently identify the type of a motion from its initial phase. Once a motion is identified, the robot executes the corresponding reaction by moving its joints along specific trajectories derived through temporal alignment and stored in a pre-selected motion library. In this study, we developed a multi-axial robotic arm integrated with the motion identification model to interact with humans by emulating their movements. The robotic arm follows pre-selected trajectories, chosen according to the identified human motion, to produce the corresponding interactions. To address the nonlinearities and cross-coupled dynamics of the robotic system, we applied a control strategy for precise motion tracking. The integrated system achieves adequate control performance, validating the feasibility of such an interactive robotic system for effective bio-inspired motion emulation.
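The abstract describes a CNN that classifies the type of a human motion from its initial segment and then indexes into a pre-selected motion library of joint trajectories. The sketch below is a minimal, illustrative example of such a classifier, not the authors' implementation; the channel count, window length, number of motion classes, and network architecture are assumptions made for illustration.

```python
# Minimal sketch (illustrative only) of a 1D CNN that classifies a motion type
# from the initial window of a multi-channel joint-angle time series.
# All sizes below are assumed values, not parameters reported in the paper.
import torch
import torch.nn as nn

NUM_JOINT_CHANNELS = 6    # assumed: joint-angle channels streamed from a capture device
INITIAL_WINDOW = 50       # assumed: number of samples taken from the start of the motion
NUM_MOTION_CLASSES = 4    # assumed: size of the pre-selected motion library

class MotionCNN(nn.Module):
    """1D CNN over the initial window of a multi-channel motion sequence."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_JOINT_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time dimension
        )
        self.classifier = nn.Linear(64, NUM_MOTION_CLASSES)

    def forward(self, x):              # x shape: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = MotionCNN()
    window = torch.randn(1, NUM_JOINT_CHANNELS, INITIAL_WINDOW)   # dummy initial segment
    motion_id = model(window).argmax(dim=1)   # index into the stored trajectory library
    print("predicted motion class:", motion_id.item())
```

In the system described, the predicted class would select a pre-stored joint trajectory, which the tracking controller then follows on the multi-axial arm; the controller itself is not sketched here.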