Abstract

Because photovoltaic power generation is volatile and random, traditional models struggle to predict it accurately. To address this problem, we established a model based on the self-attention mechanism and multi-task learning to predict ultra-short-term photovoltaic power generation. First, we selected the data with the optimal sequence length and fed it into an Encoder-Decoder network based on self-attention, in which the decoder checks the validity of the features extracted by the encoder. Then, we added a restriction to the middle layer of the Encoder-Decoder network to prevent the autoencoder from mechanically copying the input to the output; this restricted representation is also used to predict the photovoltaic power generation, which yields a multi-task learning model. Finally, to take full advantage of the efficiently expressed features and to let the main prediction task learn some unique features autonomously, we proposed a step-by-step training method and validated its effectiveness experimentally. Compared with Encoder-Decoder networks based on CNN and LSTM, the performance of the proposed method improved by 14.82% and 8.09%, respectively. The RMSE and MAE of the self-attention-based Encoder-Decoder model trained step by step are 0.071 and 0.040, respectively.
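
As a rough illustration of the model described above, the sketch below shows an Encoder-Decoder with a self-attention encoder, a reconstruction decoder (auxiliary task), and a power-prediction head (main task). All layer sizes, module names, and the single-layer heads are illustrative assumptions, not the authors' exact configuration.

    # Minimal PyTorch sketch of a self-attention Encoder-Decoder with a
    # reconstruction task and a PV power prediction task (multi-task learning).
    # Layer sizes and module choices here are assumptions for illustration only.
    import torch
    import torch.nn as nn

    class MultiTaskPVModel(nn.Module):
        def __init__(self, n_features=8, d_model=64, n_heads=4, n_layers=2, horizon=1):
            super().__init__()
            self.input_proj = nn.Linear(n_features, d_model)
            enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
            # Auxiliary task: the decoder reconstructs the input sequence, which
            # checks that the encoder's features retain the original information.
            self.reconstruct = nn.Linear(d_model, n_features)
            # Main task: predict ultra-short-term PV power from the encoded features.
            self.predict = nn.Linear(d_model, horizon)

        def forward(self, x):                       # x: (batch, seq_len, n_features)
            h = self.encoder(self.input_proj(x))    # self-attention features
            recon = self.reconstruct(h)             # (batch, seq_len, n_features)
            power = self.predict(h[:, -1, :])       # (batch, horizon)
            return recon, power

    # Step-by-step training (details assumed): first fit the reconstruction
    # task, then train the prediction head on the learned features.
    model = MultiTaskPVModel()
    x = torch.randn(32, 24, 8)                      # dummy batch: 24 past time steps
    recon, power = model(x)
    stage1_loss = nn.MSELoss()(recon, x)            # stage 1: reconstruction loss

The step-by-step idea, as the abstract describes it, is that the reconstruction stage forces the encoder to produce efficiently expressed features, after which the prediction task can refine them with features of its own.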

Highlights

  • Solar energy is a new energy source with zero emissions; it is pollution-free and inexhaustible [1]

  • We added two simple neural network models, extreme learning machine (ELM) and DNN, as comparative experiments; in addition, CNN, LSTM and self-attention were used as the encoder components to extract features and perform single-task learning, i.e., the model only predicts the photovoltaic power generation

  • Compared with models based on LSTM components, the RMSE of the models employing self-attention components decreased by 4.82%, 8.31% and 8.09%


Summary

INTRODUCTION

Solar energy is a new energy source with zero emissions; it is pollution-free and inexhaustible [1]. The main innovations of this paper are as follows: 1) The self-attention mechanism was applied to photovoltaic power generation prediction for the first time. It is superior at capturing long-distance sequence information quickly and effectively, and achieves better results than CNN and RNN. 2) A multi-task learning model based on the Encoder-Decoder structure was designed. The decoder was used to restore the information extracted by the encoder; the two tasks cooperate with each other and enhance the reliability and accuracy of the prediction results. 3) A step-by-step training method is used in our model.

Vaswani et al. [33] proposed the Transformer model, which abandoned the traditional approach of using recurrent or convolutional neural networks as the encoder and decoder and instead relied entirely on the attention mechanism, called self-attention, to transfer information between network layers, achieving good results in the NLP field. The method of computing attention in the self-attention mechanism can be defined as formula 4.
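
For reference, the scaled dot-product attention introduced by Vaswani et al. [33], which is presumably the form formula 4 takes, is

    \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V

where Q, K and V are the query, key and value matrices obtained by linearly projecting the input sequence, and d_k is the dimension of the keys.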

ENCODER-DECODER STRUCTURE
Findings
CONCLUSION