Abstract

Large Transformer models have achieved strong results in many fields, such as computer vision (CV) and natural language processing (NLP). In the financial domain, however, large deep learning models are rarely applied. Stock Trend Prediction (STP) is the task of predicting the future trend of a stock price (up, down, or stationary) from a sequence of historical Limit Order Book (LOB) records. Current works are mostly based on a Convolutional Neural Network (CNN) + Recurrent Neural Network (RNN) structure. This structure is hard to parallelize and cannot make full use of GPU resources; it is also difficult to scale up to fit more complex data, and it performs poorly on long time sequences. Recently, some works have proposed that a CNN + Transformer model can also solve this task. This paper verifies that the Transformer can be applied directly to the STP task with good results, and proposes a novel Transformer-based model, Transformer-LOB, to improve on the basic Transformer. The model uses attention mechanisms instead of an RNN to extract temporal information, which utilizes the GPU effectively. Since all feature extraction is based on Transformer modules, the model is scalable and easy to parallelize. Transformer-LOB is tested on the FI-2010 LOB dataset and the SZ-2015 LOB dataset, and yields strong results on both.
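The abstract's core claim is that self-attention can replace an RNN for temporal feature extraction over a LOB window, because attention processes all time steps with parallel matrix operations. The minimal NumPy sketch below illustrates this idea only; the feature dimensions, random weights, and toy input are hypothetical and not taken from the paper's architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over a sequence X of shape (T, d).

    Unlike an RNN, there is no step-by-step recurrence: every time step
    attends to every other in a single batch of matrix multiplies, which
    is what makes the computation easy to parallelize on a GPU.
    """
    T, d = X.shape
    rng = np.random.default_rng(0)
    # Random projections stand in for learned query/key/value weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)    # (T, T) pairwise time-step interactions
    A = softmax(scores, axis=-1)     # attention weights, rows sum to 1
    return A @ V                     # (T, d) temporal features

# Toy LOB window: 10 time steps, 4 features (e.g. best bid/ask price and volume).
X = np.arange(40, dtype=float).reshape(10, 4) / 40.0
H = self_attention(X)
print(H.shape)
```

In a full model, stacks of such attention layers (with learned weights, multiple heads, and positional encodings) would feed a classifier over the three trend labels; this sketch only shows why the temporal extraction step parallelizes where an RNN cannot.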
