Abstract

Time-series forecasting (TSF) is a traditional problem in the field of artificial intelligence, and models such as recurrent neural networks, long short-term memory, and gated recurrent units have contributed to improving its predictive accuracy. Furthermore, model structures have been proposed that combine time-series decomposition methods such as seasonal-trend decomposition using LOESS. However, this approach trains an independent model for each component and therefore cannot learn the relationships between the time-series components. In this study, we propose a new neural architecture called a correlation recurrent unit (CRU) that can perform time-series decomposition within a neural cell and learn the correlations (autocorrelation and cross-correlation) between the decomposition components. The proposed neural architecture was evaluated through comparative experiments with previous studies using four univariate and four multivariate time-series datasets. The results showed that long- and short-term predictive performance improved by more than 10%. The experimental results indicate that the proposed CRU is an excellent method for TSF problems compared to other neural architectures.
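The decomposition-then-model pipeline that the abstract contrasts CRU against can be sketched with a naive additive seasonal-trend decomposition. This is a simplified, illustrative stand-in for STL/LOESS, not the paper's method; the function name and synthetic series below are assumptions for demonstration only:

```python
import numpy as np

def decompose(series, period):
    """Naive additive decomposition into trend, seasonal, and residual.

    In the conventional pipeline criticized by the abstract, each of these
    components would then be forecast by an independent model, so
    cross-component relationships are never learned.
    """
    n = len(series)
    # Trend: centered moving average over one seasonal period
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")
    detrended = series - trend
    # Seasonal: mean of the detrended values at each phase of the period
    phase_means = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(phase_means, n // period + 1)[:n]
    # Residual: whatever trend and seasonality do not explain
    residual = series - trend - seasonal
    return trend, seasonal, residual

# Synthetic example: linear trend + yearly seasonality + noise
t = np.arange(240)
y = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 12)
y += np.random.default_rng(0).normal(0.0, 0.3, size=t.size)
trend, seasonal, residual = decompose(y, period=12)
# Additive decomposition: the components sum back to the original series
assert np.allclose(trend + seasonal + residual, y)
```

A CRU-style cell, by contrast, would perform this decomposition inside the recurrent unit and model the components jointly, so that correlations among trend, seasonal, and residual dynamics can inform the forecast.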
