Abstract

Soft mobile robots have shown great potential in unstructured and confined environments by taking advantage of their excellent adaptability and high dexterity. However, several issues remain to be addressed in soft robots, such as actuation speed and controllability. In this letter, a new vibration actuator is proposed that exploits the nonlinear stiffness characteristic of a hyperelastic material to create continuous vibration of the actuator. By integrating three of the proposed actuators, we also present an advanced soft mobile robot with a high degree of freedom of movement. However, since the dynamic model of the soft mobile robot is generally intractable to obtain, designing a controller for the robot is difficult. In this regard, we present a method to train a controller using a novel reinforcement learning (RL) algorithm called adaptive soft actor-critic (ASAC). ASAC gradually reduces a parameter called the entropy temperature, which regulates the entropy of the control policy. In this way, the proposed method narrows the search space during training and reduces the duration of the demanding data-collection process in real-world experiments. To verify the robustness and controllability of our robot and the RL algorithm, zig-zag path tracking and obstacle avoidance experiments were conducted, and the robot successfully completed the missions with only an hour of training time.
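As a rough illustration of the entropy-temperature annealing idea described above, the sketch below shows how a temperature parameter could be decayed over the course of SAC-style training. This is not the authors' implementation; the exponential schedule, parameter names, and all numerical values are assumptions made for illustration only.

```python
# Minimal sketch (assumed, not the authors' code): gradually reducing the
# entropy temperature alpha during SAC-style training, which shrinks the
# entropy bonus and thereby narrows policy exploration over time.
import numpy as np


def entropy_temperature(step, alpha_init=0.2, alpha_final=0.01, decay_steps=10_000):
    """Exponentially anneal alpha from alpha_init to alpha_final over decay_steps updates."""
    frac = min(step / decay_steps, 1.0)
    return alpha_init * (alpha_final / alpha_init) ** frac


def policy_loss(q_values, log_probs, alpha):
    """Standard SAC policy objective: minimize E[alpha * log_pi(a|s) - Q(s, a)]."""
    return np.mean(alpha * log_probs - q_values)


# Example: the temperature shrinks as training proceeds.
for step in (0, 2_500, 5_000, 10_000):
    print(step, round(entropy_temperature(step), 4))
```

With a schedule of this kind, early training behaves like high-entropy SAC (broad exploration), while later updates increasingly favor exploitation, which is one plausible way to cut down costly real-world data collection.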
