Abstract

This paper presents a data-driven approach that leverages reinforcement learning to optimize the energy consumption of a smart home equipped with a rooftop solar photovoltaic system, an energy storage system, and smart home appliances. Compared with existing model-based optimization methods for home energy management systems, the novelty of the proposed approach is twofold: (1) a model-free Q-learning method is applied to energy consumption scheduling for each controllable home appliance (air conditioner or washing machine), as well as to the charging and discharging of the energy storage system, and (2) an artificial neural network that predicts the indoor temperature helps the Q-learning algorithm accurately learn the relationship between the indoor temperature and the energy consumption of the air conditioner. The proposed Q-learning home energy management algorithm, integrated with the artificial neural network model, reduces the consumer's electricity bill while respecting the preferred comfort level (such as the indoor temperature) and the appliance operating characteristics. The simulations consider a single home with a solar photovoltaic system, an air conditioner, a washing machine, and an energy storage system under time-of-use pricing. The results show that the proposed algorithm reduces the electricity bill by 14% relative to the existing optimization approach.

Highlights

  • With the advent of the Internet of Things (IoT) technology, smart sensors, and advanced communication and control methods in electric energy systems, increasing amounts of electric energy-related data are being produced and utilized for the reliable and efficient operation of electric energy systems.

  • Various studies have been conducted on the home energy management system (HEMS) optimization formulation in different types of optimization models and performance assessment [4,5,6,7,8,9,10,11,12,13,14,15,16]. These approaches include the scheduling of different types of home appliances along with electric vehicles using linear programming (LP) [4,5], load scheduling considering the consumer comfort level using mixed integer nonlinear programming (MINLP) [6], convex programming based on relaxed …

  • We have proposed a machine learning-based smart home energy management algorithm using reinforcement learning and an artificial neural network



Introduction

With the advent of the Internet of Things (IoT) technology, smart sensors, and advanced communication and control methods in electric energy systems, increasing amounts of electric energy-related data are being produced and utilized for the reliable and efficient operation of electric energy systems. We present an RL-based HEMS model that optimizes the energy consumption of a smart home with a rooftop PV system, an ESS, and smart home appliances. In the HEMS model, the Q-learning method is applied to the energy consumption scheduling of the different home appliances (air conditioner, washing machine, and ESS), whereby the agent of each appliance independently determines its optimal policy to reduce its own electricity cost within the consumer comfort level and the appliance operating characteristics. The simulation results confirm that the proposed RL method with the ANN successfully reduces both the consumer electricity bill and the dissatisfaction cost (for example, keeping the indoor temperature and the operating time interval of the washing machine within the consumer comfort settings).
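To make the per-appliance Q-learning idea concrete, the following is a minimal tabular Q-learning sketch for a single air-conditioner agent under a time-of-use tariff. All specifics here are illustrative assumptions, not the paper's formulation: the toy tariff, the discretized state of (hour, rounded indoor temperature), the three power levels, the discomfort weight, and the simple linear temperature model standing in for the paper's ANN predictor.

```python
import random
from collections import defaultdict

# --- Illustrative assumptions (not the paper's exact values) ---
PRICE = {h: (0.20 if 17 <= h < 21 else 0.05) for h in range(24)}  # toy ToU tariff ($/kWh)
ACTIONS = [0.0, 1.0, 2.0]          # AC power levels in kW (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1 # learning rate, discount, exploration rate
BETA = 0.5                         # weight of the discomfort penalty (assumed)

Q = defaultdict(float)             # Q[(state, action)], state = (hour, rounded temp)

def reward(hour, power, indoor_temp, setpoint=24.0):
    # Reward = negative electricity cost minus a comfort penalty on
    # deviation from the preferred indoor temperature.
    return -(PRICE[hour] * power) - BETA * abs(indoor_temp - setpoint)

def step_temperature(indoor_temp, power, outdoor_temp=30.0):
    # Crude linear stand-in for the paper's ANN indoor-temperature model:
    # the room drifts toward the outdoor temperature and cooling lowers it.
    return indoor_temp + 0.3 * (outdoor_temp - indoor_temp) - 0.8 * power

def choose_action(state):
    # Epsilon-greedy action selection over the Q-table.
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def train(episodes=200):
    for _ in range(episodes):
        temp = 28.0                       # initial indoor temperature each day
        for hour in range(24):
            state = (hour, round(temp))
            action = choose_action(state)
            next_temp = step_temperature(temp, action)
            r = reward(hour, action, next_temp)
            next_state = (hour + 1, round(next_temp))
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            # Standard Q-learning update.
            Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])
            temp = next_temp

random.seed(0)
train()
```

In the paper's multi-agent setting, each appliance (AC, washing machine, ESS) would run its own such agent with its own state, action set, and reward; replacing `step_temperature` with a trained ANN is what lets the AC agent learn the temperature-consumption relationship accurately.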

Related Research
Preliminary
Conventional HEMS Optimization Formulation
Net Power Consumption
Operating Characteristics for Controllable Appliances
Home Energy Management via Q-Learning
State Space
Action Space
Reward
Prediction of Indoor Temperature via ANN
Simulation Setup
Performance of the Proposed RL-Based HEMS Algorithm
Impact of Different Parameters in Reward Function on the Proposed Algorithm
Impact of ANN on AC Agent Performance
Performance Comparison between MILP- and RL-Based HEMS
Discussion
Findings
Constraint of the Lifetime for ESS
Conclusions