Abstract

We study the joint scheduling of deferrable demands (e.g., the charging of electric vehicles) and storage systems in the presence of random supply, demand arrivals, and processing costs, subject to processing rate limit constraints. We formulate the scheduling problem as a dynamic program that minimizes the expected total cost, i.e., the sum of the processing costs and the noncompletion penalty (incurred when a task is not fully processed by its deadline). Under mild assumptions, we characterize an optimal index-based priority rule: tasks with less laxity should be processed first, and of two tasks with the same laxity, the task with the later deadline has priority. Building on the established characterizations of the optimal control policy (for resource allocation among multiple tasks and for storage operation), we propose to apply data-driven reinforcement learning (RL) methods to make energy procurement decisions. Numerical results show that the proposed approach significantly outperforms existing RL methods combined with the earliest-deadline-first priority rule, reducing system cost by 26%–32%.
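
The priority rule above can be made concrete with a minimal sketch, assuming the standard definition of laxity (time to deadline minus remaining processing time) and a hypothetical `Task` record with `remaining` and `deadline` fields; this is an illustration of the index rule, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class Task:
    remaining: int   # remaining processing time (in time slots)
    deadline: int    # absolute deadline (slot index)

def laxity(task: Task, t: int) -> int:
    # Slack remaining before the task can no longer finish on time.
    return task.deadline - t - task.remaining

def priority_order(tasks: list[Task], t: int) -> list[Task]:
    # Least laxity first; ties broken in favor of the later deadline,
    # as the index rule prescribes.
    return sorted(tasks, key=lambda x: (laxity(x, t), -x.deadline))

# Example: at t = 0, task A (3 slots left, deadline 5) and task B
# (2 slots left, deadline 4) both have laxity 2, so A's later
# deadline gives it priority under the tie-break.
tasks = [Task(remaining=3, deadline=5), Task(remaining=2, deadline=4)]
for task in priority_order(tasks, t=0):
    print(task)
```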
