Reducing carbon emissions is a critical issue for the near future, as climate change is an imminent reality. To reduce our carbon footprint, society must change its habits and behaviours to optimise energy consumption, and current progress in embedded systems and artificial intelligence has the potential to make this easier. The smart building concept and intelligent energy management are key to increasing the use of renewable sources of energy as opposed to fossil fuels. In addition, cyber-physical systems (CPSs) provide an abstraction of the management of services that allows the integration of both virtual and physical systems in a seamless control architecture. In this paper, we propose to use multiagent reinforcement learning (MARL) to model the CPS services control plane in a smart house, with the purpose of minimising the use of non-renewable energy (a fuel generator) by shifting or shutting down services and by exploiting solar production and batteries. Furthermore, our proposal dynamically adapts its behaviour in real time according to current and historical energy production, and is thus able to handle occasional changes in energy production due to meteorological phenomena or unexpected energy consumption. To evaluate our proposal, we developed an open-source smart building energy simulator and deployed our use case. Finally, several simulations with different configurations were run to verify its performance. The simulation results show that the reinforcement learning solution outperformed the priority-based and heuristic-based solutions in both power consumption and adaptability across all configurations.
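To make the control scheme described above more concrete, the sketch below shows one plausible way to structure the MARL control plane: an independent tabular Q-learning agent per CPS service that chooses to run, shift, or shut down its service, with a reward that penalises any demand covered by the fuel generator rather than solar production or the battery. This is a minimal illustrative sketch under assumed names, state discretisation, and reward values; it is not the paper's actual implementation.

```python
# Hypothetical sketch: independent Q-learning agents, one per CPS service.
# All class names, parameters, and reward magnitudes are assumptions for
# illustration, not the paper's implementation.
import random
from collections import defaultdict

ACTIONS = ["run", "shift", "shutdown"]

class ServiceAgent:
    """Independent tabular Q-learning agent controlling one service."""
    def __init__(self, power_kw, alpha=0.1, gamma=0.95, epsilon=0.1):
        self.power_kw = power_kw
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        # Q-table: state -> {action: value}
        self.q = defaultdict(lambda: {a: 0.0 for a in ACTIONS})

    def act(self, state):
        if random.random() < self.epsilon:                     # explore
            return random.choice(ACTIONS)
        return max(self.q[state], key=self.q[state].get)       # exploit

    def learn(self, state, action, reward, next_state):
        best_next = max(self.q[next_state].values())
        td_target = reward + self.gamma * best_next
        self.q[state][action] += self.alpha * (td_target - self.q[state][action])


def discretise(solar_kw, battery_soc):
    """Coarse state: solar output bucket and battery state-of-charge bucket."""
    return (int(solar_kw), round(battery_soc, 1))


def service_reward(action, demand_kw, solar_kw, battery_kw):
    """Penalise energy that must come from the fuel generator; small
    penalty for shutting a service down entirely (comfort loss)."""
    if action == "shutdown":
        return -0.5
    served = demand_kw if action == "run" else 0.5 * demand_kw  # shifted load
    generator_kw = max(0.0, served - solar_kw - battery_kw)
    return -generator_kw
```

In a training loop over the simulator, each agent would observe the shared discretised state, pick an action, receive its reward from the resulting energy balance, and update its own Q-table; coordination emerges only through the shared state and reward signal, which is one common (but assumed here) way to instantiate MARL for this kind of load-shifting problem.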