Abstract

There are currently no methods capable of long-term forecasting from very long real-world data series, and any false prediction may damage the electrical transformer. To address this problem, a transformer power load forecasting method based on the Informer model is used for long-term rolling forecasting. The method uses the Informer model's self-attention distillation mechanism, which allows each encoder layer to halve the length of its input sequence, greatly reducing the encoder's memory overhead; in addition, this paper adds a rolling long-term prediction function that keeps the prediction decoding time extremely short. Taking the power load data of a province in China as a test case, the improved model, Informer*, was compared with the traditional Informer model. The results show that the Informer* model achieves higher prediction accuracy, effectively improving load forecasting.
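To make the distillation step concrete, the sketch below shows one way a self-attention distilling layer of the kind used between Informer encoder layers can be written in PyTorch: a 1-D convolution over the time axis followed by stride-2 max-pooling, which halves the sequence length passed to the next layer. The class name, layer sizes, and the 96-step input window are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class DistillingLayer(nn.Module):
    """Illustrative self-attention distilling layer: halves the temporal length
    between encoder layers (assumed configuration, not the paper's exact code)."""

    def __init__(self, d_model: int):
        super().__init__()
        # 1-D convolution over the time axis; feature dimension stays d_model
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        self.norm = nn.BatchNorm1d(d_model)
        self.act = nn.ELU()
        # stride-2 max-pooling is what actually halves the sequence length
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> (batch, d_model, seq_len) for Conv1d
        x = self.conv(x.transpose(1, 2))
        x = self.pool(self.act(self.norm(x)))
        # back to (batch, seq_len // 2, d_model) for the next encoder layer
        return x.transpose(1, 2)


if __name__ == "__main__":
    layer = DistillingLayer(d_model=512)
    x = torch.randn(8, 96, 512)   # 96-step input window (assumed)
    print(layer(x).shape)         # torch.Size([8, 48, 512])
```

Stacking such layers shrinks the sequence geometrically (96, 48, 24, ...), which is why the encoder's attention memory cost drops sharply as the abstract describes.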
