Abstract

In the real world, long-sequence prediction of the industry kinetic energy index is crucial: it provides a reference for business expansion and risk prevention in company energy services. However, existing time-series prediction methods suffer from high time and space costs, declining prediction speed, and difficulty in capturing long-term dependencies under the long time-series forecasting setting. In view of this, we propose a Stacked Transformer consisting of a stacked encoder and a sequence decoder. Specifically, we split the long time series into several short sequences, then learn the local and global information of the sequence with the stacked encoder to address the long-range dependency problem. Each short sequence carries the position, time, and feature information of the series. In addition, a fully connected layer efficiently predicts the future sequence in a single forward operation rather than in a step-by-step manner.
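As a rough illustration of the described pipeline, the sketch below splits a long input series into short segments, encodes the segment tokens with a stacked Transformer encoder, and maps the pooled representation to all future steps through one fully connected layer. The class name, layer sizes, mean pooling, and the omission of explicit position/time embeddings are assumptions made for brevity, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class StackedTransformerForecaster(nn.Module):
    """Minimal sketch, assuming: segments are flattened into tokens, a standard
    Transformer encoder stack mixes local/global information, and a single
    fully connected head emits the whole forecast horizon in one forward pass."""

    def __init__(self, n_features, segment_len, horizon, d_model=64, n_heads=4, n_layers=3):
        super().__init__()
        self.segment_len = segment_len
        self.embed = nn.Linear(n_features * segment_len, d_model)   # one short segment -> one token
        encoder_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, n_layers)
        self.head = nn.Linear(d_model, horizon)                     # one-shot multi-step output

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        b, t, f = x.shape
        n_seg = t // self.segment_len
        x = x[:, :n_seg * self.segment_len]     # drop the remainder for simplicity
        segs = x.reshape(b, n_seg, self.segment_len * f)
        tokens = self.embed(segs)               # (batch, n_seg, d_model)
        enc = self.encoder(tokens)              # self-attention over segment tokens
        pooled = enc.mean(dim=1)                # pool segment representations
        return self.head(pooled)                # (batch, horizon), single forward operation

# usage (illustrative sizes): model = StackedTransformerForecaster(n_features=8, segment_len=24, horizon=96)
```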
