Abstract

Extending the horizon of time series forecasting has a long-term impact on smart-grid energy consumption planning, residential electricity monitoring, extreme-weather warning, and other real-world applications. This article studies reliable long-term load-trend forecasting approaches in smart grid environments. In recent years, improved neural network models based on the self-attention mechanism have shown good performance on many sequence tasks, but most studies focus on reducing the complexity of the network layers and cannot suppress the accumulation of computational error in longer-range forecasting scenarios. Moreover, most models lack the ability to mine latent high-dimensional features of time series. Motivated by these problems, we design a reliable hierarchical self-attention model named the long-term stability network (LTSNet), which adopts a tree-shaped decomposition neural network architecture built on hierarchical residual self-attention blocks to incrementally mine high-dimensional features of temporal components in a top-down manner. At each layer, the attention matrix is used for feature interaction, reducing the distribution gap between time series fragments from different domains. Compared with existing models, LTSNet maintains stable forecasting performance and speed in long-term forecasting services and achieves the most reliable multivariate and univariate forecasting results across multiple domain scenarios. Compared with the latest models, the forecasting accuracy of the proposed model is improved by 22.8% and 13.8%, respectively, covering three applications: energy consumption, residential electricity consumption, and weather forecasting. Our experiments also verify that the hierarchical decomposition network can serve as a backbone architecture that extends effectively to longer-horizon load-trend forecasting scenarios.
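To make the architectural idea concrete, the following is a minimal, hypothetical sketch of the tree-shaped, top-down decomposition the abstract describes: each level splits the series into a trend and a seasonal component (here via a simple moving average, a common decomposition choice), refines the trend with a residual self-attention step, and recurses on the seasonal remainder. All function names, the moving-average decomposition, and the unparameterized attention are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def moving_average(x, k=5):
    # Illustrative series decomposition: trend = moving average,
    # seasonal = what remains after removing the trend.
    pad = k // 2
    xp = np.pad(x, ((pad, k - 1 - pad), (0, 0)), mode="edge")
    kernel = np.ones(k) / k
    trend = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="valid"), 0, xp)
    return trend, x - trend

def self_attention(x):
    # Single-head scaled dot-product self-attention over time steps;
    # projection weights are omitted to keep the sketch minimal.
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ x

def hierarchical_block(x, depth=3):
    # Tree-shaped, top-down decomposition: refine the trend with a
    # residual self-attention step, then recurse on the seasonal part.
    if depth == 0:
        return x
    trend, seasonal = moving_average(x)
    trend = trend + self_attention(trend)          # residual connection
    seasonal = hierarchical_block(seasonal, depth - 1)
    return trend + seasonal

rng = np.random.default_rng(0)
series = rng.standard_normal((96, 8))   # (time steps, features)
out = hierarchical_block(series)
print(out.shape)                        # same shape as the input
```

The recursion depth plays the role of the hierarchy's number of layers; in the real model each level would additionally exchange information through learned attention matrices rather than the fixed, weight-free attention used here.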


