The high-quality development of the manufacturing industry requires accelerating its transformation towards high-end, intelligent, and green production. Considering logistics resource constraints, the impact of dynamic disturbance events on production, and the need for energy-efficient operation, this study investigates the integrated scheduling of production equipment and automated guided vehicles (AGVs) in a flexible job shop environment. First, a static model for the integrated scheduling of production equipment and AGVs (ISPEA) is formulated as a mixed-integer program that minimizes the maximum completion time and the total production energy consumption (EC). In recent years, reinforcement learning, including deep reinforcement learning (DRL), has shown significant advantages in workshop scheduling problems with sequential decision-making characteristics: it can fully exploit the large volume of historical data accumulated in the workshop and adjust production plans promptly as production conditions and demand change. Accordingly, a DRL-based approach is introduced to handle emergency order insertion, a common production disturbance. Based on the characteristics of the ISPEA problem and an event-driven strategy for handling dynamic events, four types of agents are defined: workpiece selection, machine selection, AGV selection, and target selection agents. These agents take refined workshop production status features as observations and output rules for selecting workpieces, machines, AGVs, and optimization targets. The agents are trained offline with the QMIX multi-agent reinforcement learning framework, and the trained agents are then used to solve the dynamic ISPEA problem. Finally, the effectiveness of the proposed model and method is validated by comparing its solution performance with that of other typical optimization algorithms on various cases.
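To make the four-agent, event-driven decision structure described above more concrete, the following is a minimal sketch of how such a joint rule-selection step could be organized. It is not the paper's implementation: the class names, rule sets, observation features, and the epsilon-greedy stand-in for the QMIX-trained Q-networks are all illustrative assumptions.

```python
# Minimal, illustrative sketch of an event-driven four-agent decision step.
# All names (ShopState, RuleAgent, rule lists, features) are hypothetical and
# only mirror the structure summarized in the abstract, not the paper's code.
import random
from dataclasses import dataclass, field

WORKPIECE_RULES = ["SPT", "LPT", "EDD", "MOR"]                    # candidate workpiece-selection rules (assumed)
MACHINE_RULES   = ["min_load", "min_energy", "earliest_available"]
AGV_RULES       = ["nearest_idle", "least_utilized"]
TARGET_RULES    = ["makespan_first", "energy_first"]              # which objective to favor at this decision point

@dataclass
class ShopState:
    """Refined production-status features used as agent observations (placeholder values)."""
    features: list = field(default_factory=lambda: [0.0] * 8)

class RuleAgent:
    """One decision agent; in the paper each would be a QMIX-trained Q-network."""
    def __init__(self, rules):
        self.rules = rules

    def act(self, obs, epsilon=0.1):
        # Epsilon-greedy stand-in for a learned Q-network's greedy action choice.
        if random.random() < epsilon:
            return random.choice(self.rules)
        return self.rules[hash(tuple(obs)) % len(self.rules)]

agents = {
    "workpiece": RuleAgent(WORKPIECE_RULES),
    "machine":   RuleAgent(MACHINE_RULES),
    "agv":       RuleAgent(AGV_RULES),
    "target":    RuleAgent(TARGET_RULES),
}

def decide(state: ShopState) -> dict:
    """Joint decision at one event-driven rescheduling point (e.g. an emergency order insertion)."""
    return {name: agent.act(state.features) for name, agent in agents.items()}

if __name__ == "__main__":
    print(decide(ShopState()))
```

In a full system, the dictionary returned by `decide` would be applied to the shop simulator, and the resulting makespan/EC feedback would drive centralized QMIX training of the four agents offline before online use.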