Abstract
Amidst the global energy crisis, the industrial sector, as a primary energy consumer, faces unparalleled energy conservation challenges. Industrial demand response (IDR) is a promising strategy to enhance the energy efficiency of manufacturing enterprises. This paper introduces a graph reinforcement learning (GRL)-based method for flexible job shop scheduling under IDR from a production and energy nexus perspective. The scheduling problem is first formulated as a Markov Decision Process (MDP). On this basis, the scheduling state is represented by a unique heterogeneous disjunctive graph incorporating IDR features, and the reward is constructed through a tailored generalized electricity consumption index (GECI). Moreover, a mixed graph neural network scheduler is designed to tackle this MDP, integrating a heterogeneous attention mechanism and an adaptive greedy sampling strategy. A proximal policy optimization algorithm is then employed for training to obtain optimal scheduling schemes. Empirical case studies indicate that the proposed method reduces GECI by up to 14.44% relative to well-known dispatching rules and by up to 2.22% relative to an existing GRL-based scheduling method.
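To make the MDP formulation concrete, the sketch below shows a minimal scheduling environment in which each action assigns the next pending operation to a machine and the reward is the negative electricity cost under a time-varying price profile, in the spirit of an IDR-aware objective. This is an illustrative sketch only, not the paper's implementation: the environment, its state variables, and all numbers are hypothetical, and the actual GECI reward and heterogeneous disjunctive-graph state are defined in the paper itself.

```python
# Hypothetical sketch of an MDP for flexible job shop scheduling with an
# electricity-cost-based reward (not the paper's actual GECI formulation).
from dataclasses import dataclass, field

@dataclass
class FJSPEnv:
    processing_time: list  # processing_time[op][machine], in hours
    power: list            # power draw of each machine, in kW
    prices: list           # hourly electricity prices, in $/kWh
    machine_free_at: list = field(init=False)
    next_op: int = field(init=False, default=0)

    def __post_init__(self):
        self.machine_free_at = [0.0] * len(self.power)

    def step(self, machine: int):
        """Assign the next pending operation to `machine`; the reward is the
        negative electricity cost of running it at the resulting start time."""
        dur = self.processing_time[self.next_op][machine]
        start = self.machine_free_at[machine]
        # Simplification: price is looked up from the hour of the start slot.
        price = self.prices[int(start) % len(self.prices)]
        cost = self.power[machine] * dur * price
        self.machine_free_at[machine] = start + dur
        self.next_op += 1
        done = self.next_op >= len(self.processing_time)
        return -cost, done

env = FJSPEnv(
    processing_time=[[2.0, 3.0], [1.0, 2.0]],  # 2 ops x 2 machines (hours)
    power=[5.0, 3.0],                          # kW per machine
    prices=[0.10, 0.30],                       # $/kWh, alternating hourly
)
r1, _ = env.step(0)     # op 0 on machine 0: 5 kW * 2 h * $0.10 -> reward -1.0
r2, done = env.step(1)  # op 1 on machine 1: 3 kW * 2 h * $0.10 -> reward -0.6
```

A learned policy (such as the PPO-trained graph scheduler described in the abstract) would choose the machine at each step so as to maximize the cumulative reward, i.e., shift energy-intensive operations toward cheaper price slots.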