Abstract

Most Artificial Intelligence (AI) models currently used in energy forecasting are traditional and deterministic. Recently, a novel deep learning paradigm, the ‘transformer’, was developed around the mechanism of self-attention. Transformers are designed to process and predict sequential data sets (i.e., historical time records) more effectively and to capture relationships within the sequence. So far, a few transformer-based applications have been established, but no industry-scale application exists for building energy forecasting. Accordingly, this study is the world’s first to establish a transformer-based model to estimate the energy consumption of a real-scale university library and to benchmark it against a baseline Support Vector Regression (SVR) model. With a large dataset spanning 1 September 2017 to 13 November 2021 at 30 min granularity, and using four historical electricity readings to estimate one future reading, the results demonstrate that the SVR (an R2 of 0.92) outperforms the transformer-based model (an R2 of 0.82). In the sensitivity analysis, the SVR model is more sensitive to the inputs closest to the output. These findings provide new insights into energy forecasting for either a specific building or a building cluster in a city. The influence of the number of inputs and outputs on the transformer-based model will be investigated in future work.
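The baseline setup described in the abstract — four historical half-hourly readings used as lagged inputs to an SVR that predicts the next reading — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic series, the `make_windows` helper, and the SVR hyperparameters (`kernel`, `C`) are all illustrative assumptions, standing in for the real 2017–2021 library dataset.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

# Hypothetical stand-in for the half-hourly meter readings; the real
# dataset spans 1 Sep 2017 to 13 Nov 2021 at 30 min granularity.
rng = np.random.default_rng(0)
t = np.arange(2000)
series = 50 + 10 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 1, t.size)

def make_windows(y, n_in=4):
    """Turn a 1-D series into (n_in lagged inputs -> 1 output) pairs."""
    X = np.column_stack([y[i : i + len(y) - n_in] for i in range(n_in)])
    target = y[n_in:]
    return X, target

# Four historical readings predict one future reading, as in the paper.
X, y = make_windows(series, n_in=4)
split = int(0.8 * len(y))  # chronological train/test split

model = SVR(kernel="rbf", C=10.0)  # kernel and C are illustrative choices
model.fit(X[:split], y[:split])
r2 = r2_score(y[split:], model.predict(X[split:]))
```

The same windowed pairs `(X, y)` could be fed to a transformer-based regressor for a like-for-like comparison against the SVR baseline.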
