Abstract

Strip rolling is a typical manufacturing process in which conventional control approaches are widely applied. Developing these control algorithms requires a mathematical description of the process based on first principles or empirical models. However, upgrading conventional control approaches in response to ever-changing requirements and environmental conditions is difficult because it demands domain knowledge spanning control engineering, mechanical engineering, and materials science. Reinforcement learning is a machine learning method in which an agent learns by interacting with the environment, avoiding the need for such a mathematical description. This paper proposes a novel approach that combines ensemble learning with reinforcement learning for strip rolling control. Building on proximal policy optimization (PPO), a multi-actor PPO is proposed: several randomly initialized actors interact with the environment in parallel, but only the experience of the actor that obtains the highest reward is used to update all actors. Simulation results show that the proposed method outperforms conventional control methods and state-of-the-art reinforcement learning methods in terms of process capability and smoothness.
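
The following is a minimal, hypothetical sketch of the multi-actor selection step described in the abstract, not the authors' implementation: several independently initialized actors each collect a rollout, and only the highest-return trajectory is used in the PPO update of every actor. The toy environment (`ToyStripEnv`), network sizes, advantage estimate, and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torch.distributions import Categorical

class ToyStripEnv:
    """Stand-in 1-D environment: drive the state toward zero (not a rolling-mill model)."""
    def reset(self):
        self.state = torch.randn(1)
        return self.state.clone()

    def step(self, action):  # action 0: decrease, 1: increase
        self.state = self.state + (0.1 if action == 1 else -0.1)
        return self.state.clone(), -self.state.abs().item()

class Actor(nn.Module):
    def __init__(self, n_actions=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, n_actions))

    def forward(self, obs):
        return Categorical(logits=self.net(obs))

def rollout(actor, env, horizon=64):
    """Collect one trajectory and its total reward."""
    obs, data, total = env.reset(), [], 0.0
    for _ in range(horizon):
        dist = actor(obs)
        action = dist.sample()
        next_obs, reward = env.step(action.item())
        data.append((obs, action, dist.log_prob(action).detach(), reward))
        total += reward
        obs = next_obs
    return data, total

def ppo_update(actor, optimizer, data, clip=0.2, epochs=4):
    """Clipped-surrogate PPO step; the behavior policy is the best actor."""
    obs = torch.stack([d[0] for d in data])
    actions = torch.stack([d[1] for d in data])
    old_logp = torch.stack([d[2] for d in data])
    rewards = torch.tensor([d[3] for d in data])
    adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)  # crude advantage
    for _ in range(epochs):
        dist = actor(obs)
        ratio = torch.exp(dist.log_prob(actions) - old_logp)
        loss = -torch.min(ratio * adv, torch.clamp(ratio, 1 - clip, 1 + clip) * adv).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Multi-actor loop: keep only the best actor's experience, update all actors with it.
actors = [Actor() for _ in range(4)]  # randomly initialized ensemble
optims = [torch.optim.Adam(a.parameters(), lr=3e-4) for a in actors]
env = ToyStripEnv()
for it in range(10):
    results = [rollout(a, env) for a in actors]  # parallel in spirit
    best_data, best_return = max(results, key=lambda r: r[1])
    for actor, opt in zip(actors, optims):
        ppo_update(actor, opt, best_data)  # shared best-reward experience
    print(f"iteration {it}: best return {best_return:.2f}")
```

Because the shared trajectory comes from the best actor, the PPO ratio in each update is taken with respect to that actor's (behavior) policy, which is what the stored log-probabilities provide.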
