Abstract

Forecasting exchange rates is challenging because the data exhibit diverse features and complex patterns. Inspired by the theory of receptive fields, we propose two models: the Transformer with Convolutional Neural Network (CNN-Transformer) and the Long Short-Term Memory network with Convolutional Neural Network (CNN-LSTM). These models use Convolutional Neural Network (CNN) modules to expand the learnable timesteps of the Long Short-Term Memory network (LSTM) and to reduce the complexity of the Transformer, respectively. Overall, the CNN-LSTM achieves the highest predictive accuracy. Furthermore, the sentiment of past news is closely linked to future exchange rates. To incorporate news sentiment and improve model performance, we fine-tune a pre-trained model, Bidirectional Encoder Representations from Transformers with Whole Word Masking (BERT-WWM), to extract sentiment information from a news dataset. However, the effectiveness of incorporating news sentiment depends strongly on the number of timesteps: longer timesteps make this approach more likely to succeed. We attribute this pattern to the timeliness of news and the delayed impact of news events. The content of the news text also significantly influences forecasting performance.
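To make the receptive-field idea concrete, below is a minimal sketch of a CNN-LSTM forecaster, assuming a PyTorch implementation; the class name `CNNLSTM`, the layer sizes, and the single-step forecasting head are illustrative assumptions, not the authors' published code. A 1-D convolution over the time axis lets each step the LSTM sees summarize several input timesteps, which is how a CNN module can expand the LSTM's effective receptive field.

```python
# Minimal sketch (assumed, not the authors' code) of the CNN-LSTM idea:
# a 1-D convolution widens the receptive field over the input timesteps
# before the LSTM summarizes the sequence for forecasting.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, kernel: int = 3):
        super().__init__()
        # Conv1d over the time axis: each output step now covers `kernel`
        # input steps, expanding the timesteps the LSTM effectively sees.
        self.conv = nn.Conv1d(n_features, hidden,
                              kernel_size=kernel, padding=kernel // 2)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # next-step exchange-rate forecast

    def forward(self, x):  # x: (batch, timesteps, features)
        # Conv1d expects (batch, channels, length), hence the transposes.
        z = self.conv(x.transpose(1, 2)).relu().transpose(1, 2)
        out, _ = self.lstm(z)
        return self.head(out[:, -1])  # forecast from the last hidden state

# Example: a batch of 8 series, 30 timesteps, 5 features per step.
model = CNNLSTM(n_features=5)
pred = model(torch.randn(8, 30, 5))  # shape: (8, 1)
```

Setting `padding=kernel // 2` keeps the sequence length unchanged, so the LSTM still receives every timestep, now enriched with local context from neighboring steps.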
