Abstract
Due to their inductive biases, existing Convolutional Neural Network (CNN) architectures enable sample-efficient learning from time series data and have become the mainstream method for time series classification in Laser Welding Process Monitoring (LWPM). However, they face an inherent challenge in capturing long-range dependencies within long time series. Transformer-based architectures can leverage more flexible self-attention layers to capture such long-range dependencies, but they require costly training on larger datasets. In this paper, we propose a model that combines Convolutional layers with Window-based Transformer blocks (ConvWT) for time series classification. ConvWT takes advantage of both architectures while avoiding their respective limitations. To evaluate its performance, we collected an extensive Plasma-Light-Temperature (PLT) time series dataset for the classification task in LWPM and conducted extensive experiments, including ablation studies. These experiments demonstrate that ConvWT achieves better performance than existing CNN and Transformer methods.
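The hybrid design described in the abstract — a convolutional stage for local feature extraction followed by self-attention restricted to windows — can be illustrated with a toy sketch. This is a minimal, dependency-free illustration under stated assumptions (scalar features, identity Q/K/V projections, non-overlapping windows); the function names and simplifications are ours and do not reflect the authors' actual implementation.

```python
import math

def conv1d(x, kernel):
    # Valid-mode 1-D convolution: the CNN stage extracts local patterns
    # with a sample-efficient, translation-equivariant inductive bias.
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def windowed_self_attention(x, window):
    # Self-attention computed only within non-overlapping windows, which
    # caps the quadratic attention cost at window**2 per window instead
    # of len(x)**2 over the whole sequence (assumed simplification:
    # scalar features with identity query/key/value projections).
    out = []
    for start in range(0, len(x), window):
        w = x[start:start + window]
        for q in w:
            weights = softmax([q * k for k in w])
            out.append(sum(a * v for a, v in zip(weights, w)))
    return out

# Toy pipeline: conv stem, then window-based attention over its features.
series = [0.1, 0.4, 0.35, 0.8, 0.6, 0.2, 0.9, 0.5, 0.3]
features = conv1d(series, kernel=[0.25, 0.5, 0.25])  # smoothing kernel
attended = windowed_self_attention(features, window=4)
print(len(features), len(attended))
```

In a real model the scalar features would be learned channel vectors, the windows would be mixed across blocks so that information propagates beyond a single window, and a classification head would follow the final block.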