Abstract

The mobile energy storage system (MESS) plays an increasingly important role in energy systems because of its spatial and temporal flexibility, but its high upfront investment cost calls for effective operation and arbitrage strategies. In the existing literature, MESS arbitrage problems are usually cast as mixed-integer programming models. However, the performance of such model-based methods deteriorates under the uncertainties of power and transportation networks and the complicated operational characteristics of batteries. To overcome these deficiencies, this article proposes a data-driven, uncertainty-adaptive MESS arbitrage method that accounts for MESS mobility rules, battery degradation, and operational efficiencies. A two-layer deep reinforcement learning (DRL) method is developed to determine the discrete mobility decisions and the continuous charging/discharging power, and a sequential training strategy is designed to accelerate the convergence of model training. The proposed method is tested using real-world electricity prices and traffic information of charging stations. Compared with traditional model-based methods that rely on complete and accurate future information, the proposed DRL method obtains high arbitrage profits by learning arbitrage strategies from historical data and making effective decisions with limited real-time information.
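The two-layer action structure described above can be illustrated with a minimal sketch: an upper layer picks a discrete charging station and a lower layer sets a continuous power, while the environment applies one-way efficiencies and charges a per-kWh degradation cost. All numbers, names, and the simple threshold policy below are hypothetical stand-ins for the trained DRL policies, not the paper's actual method:

```python
import numpy as np

# Toy parameters (assumed for illustration, not from the paper)
ETA_C, ETA_D = 0.95, 0.95   # one-way charge / discharge efficiencies
DEG_COST = 0.02             # degradation cost per kWh of battery throughput
CAP, P_MAX = 1.0, 0.5       # battery capacity (kWh) and power limit (kW)

def mess_step(soc, power, dt=1.0):
    """Apply charging (power > 0) or discharging (power < 0) for dt hours.
    Returns the new state of charge and the grid-side energy (+ = bought)."""
    if power >= 0:
        soc_new = min(CAP, soc + power * ETA_C * dt)   # losses on the way in
        grid = (soc_new - soc) / ETA_C
    else:
        soc_new = max(0.0, soc + power / ETA_D * dt)   # losses on the way out
        grid = (soc_new - soc) * ETA_D
    return soc_new, grid

def two_layer_policy(station_prices, soc):
    """Hypothetical stand-in for the two trained policies: the upper layer
    returns a discrete station index, the lower layer a continuous power."""
    if soc < 0.5 * CAP:
        return int(np.argmin(station_prices)), P_MAX    # charge where cheap
    return int(np.argmax(station_prices)), -P_MAX       # discharge where dear

# Roll out one day of hourly decisions on synthetic station prices.
rng = np.random.default_rng(0)
prices = 0.10 + 0.15 * rng.random((24, 3))  # $/kWh at 3 stations (synthetic)
soc, profit = 0.0, 0.0
for t in range(24):
    station, power = two_layer_policy(prices[t], soc)
    soc_new, grid = mess_step(soc, power)
    profit += -prices[t, station] * grid - DEG_COST * abs(soc_new - soc)
    soc = soc_new
```

In the actual method, the threshold heuristic would be replaced by two learned networks (e.g., a discrete policy for mobility and a continuous policy for power), and the price trajectory would come from real market data rather than a random draw.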
